Working with IBM and supported by the National Science Foundation (NSF), a group of researchers in Virginia has devised a new way to improve cloud-computing efficiency through better management tools for containers, lightweight replacements for virtual machines. Containers allow microservices, the small programs responsible for retrieving data from the cloud, to be deployed more quickly and flexibly.
The study is due to be presented at FAST ’18 in Oakland, California, later this month.
The researchers focused on the fact that containers, unlike resource-heavy virtual machines, can share the host operating system's code, allowing software to be deployed far more quickly while maintaining strong performance. The study offers a large-scale assessment of Docker, the most widely used container management framework, whose registry service acts as a central repository for images, the software components that carry out particular operations.
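To make the registry idea concrete, here is a minimal, hypothetical sketch (not the study's code or Docker's actual implementation) of a central repository that stores image layers by content digest, which is why identical layers shared across images are stored only once:

```python
import hashlib

# Hypothetical sketch of a registry: a content-addressed blob store
# plus a manifest mapping image names to their layer digests.
class Registry:
    def __init__(self):
        self.blobs = {}      # digest -> layer bytes
        self.manifests = {}  # image name -> ordered list of layer digests

    def push(self, name, layers):
        digests = []
        for layer in layers:
            d = hashlib.sha256(layer).hexdigest()
            self.blobs[d] = layer  # identical content maps to the same key
            digests.append(d)
        self.manifests[name] = digests

    def pull(self, name):
        # Resolve the manifest, then fetch each layer by digest.
        return [self.blobs[d] for d in self.manifests[name]]
```

Pushing two image versions that share a base layer stores that layer once, which is one reason container deployment is faster than shipping whole virtual machine disks.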
Over a period of 75 days, the team collected a massive amount of data generated by five geographically distributed data centers: 38 million requests comprising 181.3 TB of traces, the timestamped logs that document a program's execution. The analysis showed that container technology can exploit caching and prefetching of information, which can be highly important for reducing latency.
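The caching-and-prefetching idea can be sketched as follows. This is a simplified, hypothetical illustration, not the study's design: an LRU cache holds recently requested layers, and a request for one layer of an image triggers prefetching of its sibling layers before they are asked for.

```python
from collections import OrderedDict

# Hypothetical layer cache with LRU eviction and sibling prefetching.
class LayerCache:
    def __init__(self, capacity, fetch):
        self.capacity = capacity
        self.fetch = fetch           # backend fetch function (assumed given)
        self.cache = OrderedDict()   # digest -> layer, in LRU order
        self.hits = 0
        self.misses = 0

    def _put(self, digest, layer):
        self.cache[digest] = layer
        self.cache.move_to_end(digest)
        while len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used

    def get(self, digest, related=()):
        if digest in self.cache:
            self.hits += 1
            self.cache.move_to_end(digest)
        else:
            self.misses += 1
            self._put(digest, self.fetch(digest))
        # Prefetch the image's other layers so later requests hit the cache.
        for d in related:
            if d not in self.cache:
                self._put(d, self.fetch(d))
        return self.cache[digest]
```

In this sketch, the first layer request of an image pays the backend latency, while subsequent requests for prefetched sibling layers are served from the cache.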
Mohamed Mohamed, a collaborator on the study, says that advances in container technology have shown the ability to radically improve cloud-computing performance, offering insight into application compliance, security, and performance.