The changing nature of large scalable AI and ML models – By Aditya Abeysinghe

Distributed AI

Artificial Intelligence (AI) models are often costly to operate because of the high Central Processing Unit (CPU) power required to process data. AI models are also constrained by other resources, such as volatile memory, secondary storage and power. Because of these demands, most medium- and large-scale AI models are processed on cloud or remote data center servers, which have far more CPU power and memory than small-scale processing devices. However, sending data to a remote server is often not viable due to the extra cost of data communication, the cost of maintaining and monitoring servers in the cloud, and privacy concerns. Distributed AI is a method of moving AI models out of a central server to a location close to the data source. With distributed AI, AI models are deployed at either the source or at ...
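
To make the idea concrete, the sketch below (not from the article) illustrates one common way of deploying a model close to the data source: running inference directly on an edge device with a lightweight runtime, so raw data never has to be transmitted to a remote server. It assumes a pre-converted TensorFlow Lite model file (here called model.tflite, an illustrative name) is already present on the device.

```python
# Minimal sketch: on-device inference with a lightweight model, so raw data
# stays on the edge device instead of being sent to a cloud server.
# Assumes a TensorFlow Lite model file ("model.tflite") has already been
# exported and copied onto the device; the file name is illustrative.
import numpy as np

try:
    # tflite_runtime is a small package intended for edge devices
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    # Fall back to the full TensorFlow package on a development machine
    import tensorflow as tf
    Interpreter = tf.lite.Interpreter

# Load the model and allocate its tensors once at start-up
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def predict_locally(sample: np.ndarray) -> np.ndarray:
    """Run one inference on-device; only the (small) result would ever
    need to leave the device, not the raw input data."""
    interpreter.set_tensor(input_details[0]["index"],
                           sample.astype(input_details[0]["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]["index"])

# Example: score a locally captured reading without any network call
reading = np.random.rand(*input_details[0]["shape"]).astype(np.float32)
print(predict_locally(reading))
```

In this arrangement only the prediction, or an aggregate of many predictions, would be sent upstream, which is where the savings in data communication and the privacy benefit described above come from.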
