Data aggregation is an important concept to understand in Live Optics, as it is fundamental to how the program provides statistical analysis of compute utility in IT environments.
Data aggregation is any process in which information is gathered and combined from multiple systems and expressed in summary form, for purposes such as statistical analysis.
Live Optics profile assessments provide data aggregation and analysis of server performance metrics within a compute environment. They also provide statistical data to help suppliers and consumers of technology understand their compute utility.
Previous methods of understanding compute usage within an IT environment consisted of measuring the “high water” marks for all of the systems and totaling the amounts. As the industry moved to consolidated resources such as cloud or virtualization, this proved to be a misleading practice that led to over-provisioning.
Consolidated resource environments must take into account the “spiky” nature of workloads and what happens when those patterns are blended, or “aggregated,” together. Live Optics' aggregation algorithms can more accurately define compute utility and drastically reduce the costs associated with acquiring technology. Aggregated data is displayed in both summary and graphical formats, and it is the default view of all projects.
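The difference can be illustrated with a short sketch. The Python example below uses hypothetical IOPS samples and is not Live Optics' actual algorithm; it simply contrasts the legacy sum-of-peaks sizing with the peak of the blended workload:

    # Hypothetical per-server IOPS samples; each server spikes at a
    # different point in time.
    servers = {
        "server-a": [120, 900, 150, 130, 140],   # spike in the 2nd sample
        "server-b": [200, 180, 950, 210, 190],   # spike in the 3rd sample
        "server-c": [300, 280, 310, 990, 290],   # spike in the 4th sample
    }

    # Legacy approach: total the individual "high water" marks.
    high_water_total = sum(max(samples) for samples in servers.values())

    # Aggregated approach: blend the time series first, then take the peak.
    # Spikes that occur at different times do not stack.
    blended = [sum(step) for step in zip(*servers.values())]
    aggregate_peak = max(blended)

    print(f"Sum of per-server peaks:     {high_water_total} IOPS")  # 2840
    print(f"Peak of aggregated workload: {aggregate_peak} IOPS")    # 1410

Because the three spikes never coincide, sizing to the aggregated peak requires roughly half the capacity that the high-water-mark method would suggest.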
These aggregate values provide a hardware- and platform-independent perspective of compute utility that must be satisfied when upgrading or migrating the environment.
Related To:
IOPS
Throughput (Storage)
Latency
Performance
Network Throughput
Live Optics Viewer
Efficiency
Disk Controller