Body Labs

Body Labs is a Manhattan-based software company founded in 2013. It provides human-aware artificial intelligence software that understands the 3D body shape and motion of people from RGB photos or videos.

In October 2017, the company was acquired by Amazon.

History
Body Labs was founded by Michael J. Black, William J. O'Farrell, Eric Rachlin, and Alex Weiss, who met through Brown University and the Max Planck Institute for Intelligent Systems.

In 2002, Black was researching how to create a statistical model of the human body. While Black was teaching a course on computer vision at Brown University, the Virginia State Police contacted him about a robbery and murder at a 7-Eleven. The police wanted to use computer vision to identify the suspect in a surveillance video. By creating a statistical model, Black's group could corroborate some of the evidence in the case, such as confirming the suspect's height.
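Statistical body models of this kind are commonly built with principal component analysis (PCA) over aligned 3D scans: each body is expressed as a mean shape plus a weighted sum of learned shape components. The sketch below is purely illustrative, using random synthetic data and invented sizes rather than Black's actual model or data, to show the basic idea:

```python
import numpy as np

# Illustrative sketch of a PCA-based statistical shape model.
# Data, vertex counts, and component counts are invented for this example.
rng = np.random.default_rng(0)

n_bodies, n_vertices = 50, 100                       # synthetic "scan" corpus
scans = rng.normal(size=(n_bodies, n_vertices * 3))  # flattened 3D vertices

mean_shape = scans.mean(axis=0)

# Learn shape components via SVD of the mean-centered scan matrix.
_, _, components = np.linalg.svd(scans - mean_shape, full_matrices=False)
basis = components[:10]                              # keep 10 shape components

def reconstruct(weights):
    """Generate a body shape from low-dimensional shape weights."""
    return mean_shape + weights @ basis

# Zero weights reproduce the mean body; fitting the weights to an image or
# scan lets global attributes such as height be read off the reconstruction.
body = reconstruct(np.zeros(10))
assert body.shape == (n_vertices * 3,)
```

In practice a model like this is fitted to evidence (a silhouette in surveillance footage, say) by optimizing the low-dimensional weights, which is what makes measurements such as height recoverable from a single camera view.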

On November 13, 2014, Body Labs announced $2.2 million in Seed funding led by FirstMark Capital, with additional investors including New York Angels and existing investors.

On November 3, 2015, Body Labs announced $11 million in Series A funding led by Intel Capital, with additional investors including FirstMark Capital, Max-Planck-Innovation GmbH, Osage University Partners, Catalus Capital and the company founders.

BodyKit
On March 3, 2015, Body Labs launched BodyKit, a collection of APIs and embeddable components for integrating the human body into apps and tools.

Body Labs Blue
On July 20, 2016, Body Labs launched Body Labs Blue, an API and embeddable Web interface that takes physical measurements and predicts additional digital measurements to help with custom clothing creation.

Body Labs Red
On October 5, 2016, Body Labs launched Body Labs Red, an API for automatically processing 3D scans into a full 3D body model. Additionally, Body Labs announced a partnership with 3dMD to process their 3D scans.

Mosh Mobile App
On February 15, 2017, Body Labs released Mosh on the App Store, an Apple iOS app that predicts the 3D human pose and shape of a subject and renders 3D effects on them.

SOMA: Human-Aware AI
On June 1, 2017, Body Labs launched SOMA, software that uses artificial intelligence to predict 3D human shape and motion from RGB photos or video.

On July 21, 2017, Body Labs launched the SOMA Shape API for 3D model and measurement prediction. The Shape API allows third-party apps to connect easily to the SOMA backend.