
A cloud laboratory is a heavily automated, centralized research laboratory where scientists can run an entire experimental process from a computer in a remote location. Cloud laboratories offer the execution of life-science research experiments as a service, allowing researchers to retain full control over experimental design. Users create experimental protocols through a high-level API, and the experiment is then executed in the cloud laboratory without the user needing to monitor its progress.

Cloud labs reduce variability in experimental execution because the code can be interrogated, analyzed, and executed repeatedly. They also reduce costs by sharing capital costs across many users, running experiments in parallel, and reducing instrument downtime. Finally, they facilitate collaboration by making it easier to share protocols, data, and data-processing methods through the cloud.

Infrastructure
Cloud labs offer common scientific techniques including genotyping, nucleic acid synthesis, protein extraction, liquid transfer, plate reading, Western blotting, high-performance liquid chromatography, upstream bioprocessing, sequencing, and others. Users begin by signing up and logging in to the web-based software interface. Researchers submit their protocols via a dedicated web application or through an API; when the order arrives at the laboratory, human operators set up the experiment and transfer plates from machine to machine. Data is automatically uploaded to the cloud lab via an API, where users can access and analyze it. Users can review controls, machine settings, and reagents used. Multiple experiments can be run in parallel, 24 hours a day.
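The submission workflow above can be sketched in code. The following Python example is purely illustrative: real cloud labs expose their own vendor-specific APIs (for example, Emerald Cloud Lab's Symbolic Lab Language), and the `build_protocol` and `CloudLabClient` names here are invented stand-ins.

```python
import json

def build_protocol(technique, samples, settings):
    """Assemble a machine-readable experimental protocol (hypothetical schema)."""
    return {
        "technique": technique,  # e.g. a plate-reading or HPLC run
        "samples": samples,      # plate/well identifiers
        "settings": settings,    # instrument parameters
    }

class CloudLabClient:
    """Stub standing in for a vendor's web API client."""

    def __init__(self):
        self._orders = []

    def submit(self, protocol):
        # In a real cloud lab this would POST the protocol to the service;
        # human operators then stage the plates and robots execute the run.
        order_id = f"order-{len(self._orders) + 1}"
        self._orders.append((order_id, json.dumps(protocol)))
        return order_id

protocol = build_protocol(
    technique="AbsorbanceIntensity",
    samples=["plate1/A1", "plate1/A2"],
    settings={"wavelength_nm": 280},
)
client = CloudLabClient()
order = client.submit(protocol)
```

The key point is that the entire experiment is specified as data up front, so the same protocol can be resubmitted verbatim, which is the source of the reproducibility benefit described above.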

Cloud labs are defined by five unique features:


 * 1) Users must be able to conduct experiments on-demand at any time from any location, all through a computer interface.
 * 2) The cloud laboratory must enable a user to digitally replicate the experience of standing in a traditional laboratory and manually operating instruments. It must allow users to specify all aspects of their experiments remotely, without lead time, additional software, or outside experts.
 * 3) Users must have on-demand access to all the instruments needed to perform their experiments, making a physical laboratory unnecessary.
 * 4) Users must be able to perform all aspects of sample preparation, storage, and handling from a remote setting.
 * 5) Users must be able to script and connect multiple experiments as well as process, analyze, visualize, and interpret data using a single standardized computer interface.
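Feature 5, scripting and connecting multiple experiments through one interface, can be illustrated with a short sketch. This is a hypothetical example: `run_experiment` and its returned data are invented placeholders, not a real cloud-lab API.

```python
def run_experiment(name, inputs):
    """Stand-in for dispatching one experiment and awaiting its data.

    A real cloud lab would queue the run on physical instruments and return
    measured data; here we return a deterministic placeholder result.
    """
    return {"experiment": name, "readings": [len(i) for i in inputs]}

def pipeline(samples):
    # Connect two experiments: the output of the first feeds the second.
    extraction = run_experiment("ProteinExtraction", samples)
    passing = [s for s, r in zip(samples, extraction["readings"]) if r > 0]
    blot = run_experiment("WesternBlot", passing)
    # Process and summarize the results in the same script, so experiment
    # execution and data analysis share a single interface.
    return {
        "extracted": len(samples),
        "blotted": len(passing),
        "total_signal": sum(blot["readings"]),
    }

summary = pipeline(["s1", "s2", "s3"])
```

Because both the experimental steps and the analysis live in one script, the whole pipeline can be versioned, shared, and rerun as a unit.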

Using a Cloud Laboratory vs. High-Throughput Experimentation
High-throughput experimentation (HTE) increases throughput by scaling up the number of experiments that can be run in parallel using a common sample form factor and technique. When space or materials are limited, however, minor factors must be assigned to progressively smaller fractions of the design to increase the number of replicates. Cloud labs, by contrast, do not fundamentally scale up a single experiment but instead increase the number of types of experiments that can be run in parallel. For example, with a cloud lab a scientist could simultaneously attempt dozens of different purification methods, each requiring a completely different set of equipment.

HTE work cells can sometimes be accessed remotely to trigger a run on a library or to digitally monitor a run. However, this remote monitoring or triggering does not reduce the development work that must take place before a run. With HTE, scientists must often group samples into libraries that use the same or very similar container form factors so that the work cell can more easily traffic and address each sample in an integrated manner.

History
Cloud laboratories were built on advancements made in laboratory automation in the 1990s. In the early 1990s, the modularity project of the Consortium of Automated Analytical Laboratory Systems worked to define standards by which biotechnology manufacturers could produce products that could be integrated into automated systems. In 1996, the National Committee for Clinical Laboratory Standards (now the Clinical and Laboratory Standards Institute) proposed laboratory automation standards that aimed to enable consumers of laboratory technology to purchase hardware and software from different vendors and connect them to each other seamlessly. The Committee launched five subcommittees in 1997 and released standardization protocols to guide product development through the early 2000s.

These developments in interoperability led to early examples of lab automation using cloud infrastructure, such as the Robot Scientist "Adam" in 2009. This robot encapsulated and connected all the laboratory equipment necessary to perform microbial batch experiments.

In 2010, D.J. Kleinbaum and Brian Frezza founded antiviral developer Emerald Therapeutics. To simplify laboratory testing, the group wrote centralized management software for their collection of scientific instruments and a database to store all metadata and results.

In 2012, Transcriptic was founded as a robotic cloud laboratory for on-demand scientific research, remotely performing select tasks including DNA cloning.

In 2014, Emerald Therapeutics spun out the Emerald Cloud Lab to fully replace the need for a traditional lab environment, enabling scientists from around the world to perform all necessary activities, from experimental design to data acquisition and analysis.

Carnegie Mellon University's Mellon College of Science is building the world's first academic cloud laboratory on its campus. The 20,000-square-foot laboratory will be completed in 2023 and will offer access to CMU researchers and, eventually, to other schools and life-sciences startups in Pittsburgh.