Machine Learning Based Performance Management & Capacity Planning
What we do differently
  1. ML Based Performance & Capacity Modeling
    Performance & capacity modeling is paramount for analyzing, evaluating, and quantifying the impact of IT architectural & workload changes. Systems & Application modeling provides insight into the achievable performance on current systems, allows exploring the performance improvements expected on future systems, and offers high flexibility and substantial cost savings, as no physical test environment has to be built and maintained.
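As a minimal illustration of the modeling idea, the sketch below fits a simple least-squares model to measured workload vs. response-time data and extrapolates to a workload level that was never tested on physical hardware. The data points and the linear model form are illustrative assumptions, not DHT's actual modeling methodology.

```python
# Hypothetical sketch: fit a regression model to measured performance data,
# then predict behavior at an unobserved workload level -- no test rig needed.
# All numbers below are synthetic, for illustration only.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Measured on the current system: transactions/sec vs. mean response time (ms)
workload = [100, 200, 300, 400]
resp_ms  = [12.0, 14.1, 16.0, 18.1]

a, b = fit_linear(workload, resp_ms)

# Model-based prediction for a future workload of 600 tx/s
predicted = a + b * 600
print(round(predicted, 2))  # -> 22.12
```

In practice the model form (queueing-theoretic, nonlinear, or learned) and the feature set would be far richer; the point is that predictions replace physically building out the target configuration.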
  2. ML Based Speedup & Scalability Studies
    Capacity Study -> Determine headroom: capture system behavior under increased workload conditions while the physical setup of the environment remains unchanged.
    Speedup Study -> For a fixed workload, evaluate system behavior while adding physical resources (application parallelism is paramount).
    Scalability Study -> Evaluate system behavior while (1) increasing the workload and (2) adding physical resources.
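The speedup-study idea above can be sketched with Amdahl's law: for a fixed workload, the gain from adding processors is capped by the application's serial (non-parallelizable) fraction, which is why application parallelism is paramount. The 5% serial fraction used here is an assumed value for illustration.

```python
# Illustrative speedup study via Amdahl's law (assumed serial fraction).

def amdahl_speedup(serial_fraction, n_procs):
    """Speedup on n_procs when serial_fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

# With 5% serial code, speedup flattens out well below the processor count:
for n in (2, 8, 32):
    print(n, round(amdahl_speedup(0.05, n), 2))
# Even with unlimited processors, speedup is bounded by 1/0.05 = 20x.
```

A scalability study generalizes this by growing the workload alongside the resources; a capacity study instead holds the hardware fixed and drives the workload up until headroom is exhausted.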
 
  
Technological Fusion
   
We at DHT provide machine learning based IT systems performance and capacity evaluations. We model the HW, the OS, and the application stack to quantify and optimize the status quo as well as any future workload requirements. Further, we use the models to assess application performance and scalability while changing either the OS or the HW infrastructure, and to quantify the potential of consolidating server systems and/or workloads. We do this for server systems, clusters, the Cloud, supercomputers, and any Big Data environment. As IT installations continue to grow, conducting actual empirical studies while acquiring new HW or assessing the impact of workload changes is no longer feasible. Hence, a modeling-based approach is essentially the only cost-effective way to address performance, capacity, and sizing related IT projects.
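To make the consolidation assessment concrete, here is a minimal sketch of a model-based check: sum the modeled peak resource demands of candidate workloads and verify they fit on a target server while preserving a headroom margin. The demand figures, capacities, and the 30% headroom threshold are hypothetical assumptions for illustration.

```python
# Hypothetical consolidation check: do the modeled workload demands fit on
# the target server with headroom to spare? All numbers are illustrative.

def can_consolidate(demands, capacity, headroom=0.30):
    """True if the combined demand leaves at least `headroom` of capacity free."""
    return sum(demands) <= capacity * (1.0 - headroom)

# Modeled peak CPU demand (in normalized cores) of three servers' workloads
demands = [3.2, 2.5, 4.1]  # total: 9.8 cores

print(can_consolidate(demands, capacity=16))  # 16-core target: fits
print(can_consolidate(demands, capacity=12))  # 12-core target: too tight
```

A real consolidation model would also cover memory, I/O, and the correlation of workload peaks over time, but the same fit-with-headroom question drives the decision.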