Company: Google big code_21march
Difficulty: medium
Server Load Balancer Stability

A cloud infrastructure company monitors a data center whose real-time load is recorded as a sequence of N measurements A[0], A[1], ..., A[N-1] taken at consecutive time intervals. The operations team needs to partition this timeline into at most K contiguous monitoring windows to analyze system stability.

For each monitoring window (a contiguous block of measurements), the stability score is defined as the difference between the maximum and minimum load values within that window; a smaller difference indicates more stable performance. The overall system instability is defined as the maximum stability score across all windows.

Your task is to partition the N measurements into at most K contiguous windows such that the overall system instability (the maximum of all window stability scores) is minimized, and to report this minimum possible value. If K is greater than or equal to N, each measurement can form its own window, so the answer is 0.
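One standard way to attack this (not stated in the problem itself, so treat it as one possible approach) is to binary search on the answer: for a candidate instability limit, greedily extend each window as far as its max-min stays within the limit, and count how many windows that takes. The count is non-increasing in the limit, so the smallest feasible limit can be found by binary search. A minimal sketch, with hypothetical function names:

```python
def min_max_instability(A, K):
    """Minimum possible value of max(window max - window min)
    over a partition of A into at most K contiguous windows."""
    if K >= len(A):
        return 0  # one window per measurement -> every score is 0

    def windows_needed(limit):
        # Greedy: extend the current window while its range <= limit,
        # otherwise start a new window at the current element.
        count, lo, hi = 1, A[0], A[0]
        for x in A[1:]:
            lo, hi = min(lo, x), max(hi, x)
            if hi - lo > limit:
                count += 1
                lo = hi = x
        return count

    lo, hi = 0, max(A) - min(A)
    while lo < hi:
        mid = (lo + hi) // 2
        if windows_needed(mid) <= K:
            hi = mid       # mid is feasible; try smaller
        else:
            lo = mid + 1   # mid infeasible; need a larger limit
    return lo
```

For example, with A = [1, 5, 6, 2, 8] and K = 2, the best split is [1, 5, 6, 2] | [8] with scores 5 and 0, giving 5. The greedy check runs in O(N), so the whole algorithm is O(N log(max(A) - min(A))).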