This deal could be the largest application of artificial intelligence (AI) to data centres yet, as CBRE Data Center Solutions gets ready to deploy hundreds of LitBit’s AI maintenance systems across the facilities it manages.
Based in San Jose, California, LitBit is a company that focuses on providing next-generation converged infrastructure solutions for emerging markets. It was founded by Jean Paul Balajadia and Scott Noteboom in 2013.
The company aims to use its AI technology to monitor data centres’ environmental conditions and infrastructure, looking for anomalies a human data centre manager might not see. The idea is to be proactive and catch issues before they have the chance to cause major problems.
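To give a sense of the idea (this is an illustrative sketch only, not LitBit’s actual method), the simplest form of such anomaly detection flags sensor readings that sit unusually far from the norm, for example by z-score:

```python
# Minimal sketch of sensor anomaly detection (illustrative only;
# not LitBit's actual system). Flags readings whose z-score exceeds
# a threshold -- a spike a busy operator might overlook.
from statistics import mean, stdev

def find_anomalies(readings, threshold=2.5):
    """Return (index, value) pairs more than `threshold` standard
    deviations from the mean of the series."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [(i, r) for i, r in enumerate(readings)
            if abs(r - mu) / sigma > threshold]

# Hypothetical hourly intake temperatures (deg C) with one spike.
temps = [22.1, 22.3, 22.0, 22.2, 22.4, 31.5, 22.1, 22.3, 22.2, 22.0]
print(find_anomalies(temps))  # the 31.5 reading stands out
```

A production system would of course use far richer models and many more signals, but the principle is the same: learn what “normal” looks like, then surface deviations early.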
To train its machine learning model, REMI (Risk Exposure Mitigation Intelligence), LitBit drew on human experts as well as existing and historical data, to make it as accurate and effective as possible.
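In broad strokes, training on expert-labelled history means summarizing what “normal” and “risky” operating records look like and scoring new readings against those summaries. The sketch below uses a nearest-centroid classifier with hypothetical feature names; it is not LitBit’s actual REMI pipeline, just a minimal illustration of the concept:

```python
# Illustrative sketch of learning from expert-labelled historical data
# (hypothetical features; not LitBit's actual REMI implementation).

# Each record: (fan_vibration, intake_temp_c) labelled by a human expert.
labelled = [
    ((0.20, 22.0), "normal"),
    ((0.30, 22.5), "normal"),
    ((0.25, 21.8), "normal"),
    ((1.80, 29.0), "risk"),
    ((2.10, 31.0), "risk"),
]

def train_centroids(data):
    """Average the feature vectors for each expert-assigned label."""
    sums, counts = {}, {}
    for features, label in data:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: tuple(v / counts[lbl] for v in acc)
            for lbl, acc in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist(lbl):
        return sum((a - b) ** 2 for a, b in zip(centroids[lbl], features))
    return min(centroids, key=dist)

model = train_centroids(labelled)
print(classify(model, (2.0, 30.0)))  # resembles the expert-flagged "risk" records
```

The value of the human experts in such a scheme is the labels themselves: the model can only be as good as the judgments it learns from, which is why LitBit leaned on both expertise and historical data.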
CBRE is a Los Angeles-based real estate services company that manages more than 800 data centres worldwide on behalf of various clients. With all the years of knowledge and expertise it has under its belt, and all the data it has access to, CBRE said it will be able to create “the world’s largest actionable AI repository of machine operating data.”
Currently, Google’s DeepMind AI technology is the largest application of machine learning employed in data centres to improve efficiency. And while that is obviously a huge operation, it serves a single end user. CBRE, on the other hand, works a little differently.
CBRE manages a data centre fleet for several enterprise clients, including banks and insurance firms. It operates nearly every kind of facility and equipment model imaginable, and for that reason alone has the potential to take on Google’s DeepMind. But it still has a way to go before it claims pole position.
Creating the dataset will not be easy. For one thing, older facilities tend to be less instrumented than modern ones. Another problem is that CBRE’s data centres house such diverse equipment that assembling data clean enough for training may be difficult. And lastly, not all companies using these facilities will be on board with loading their operational data into a single central repository, over concerns about competition, security, and compliance.