Problem
Given an array of integers, find out whether there are two distinct indices i and j in the array such that the absolute difference between nums[i] and nums[j] is at most t and the absolute difference between i and j is at most k.
Example 1:
Solution
Analysis
The first idea that came to my mind is the brute-force method: check every candidate pair of indices i and j against both conditions. If a pair meets them, return true; if every pair fails, return false. The shortcoming of this solution is its high time cost, roughly O(n·k) comparisons, so it cannot pass all test cases.
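As a sketch, the brute force described above might look like the following in C++ (the function name and signature are assumed to match the usual LeetCode form; the original post does not show this code):

```cpp
#include <vector>
using namespace std;

// Brute force: for each i, compare against every j with j - i <= k.
// Roughly O(n * k) time -- too slow for large inputs.
bool containsNearbyAlmostDuplicate(vector<int>& nums, int k, int t) {
    int n = nums.size();
    for (int i = 0; i < n; ++i) {
        for (int j = i + 1; j < n && j - i <= k; ++j) {
            // Use long long to avoid overflow when subtracting ints.
            long long diff = (long long)nums[i] - (long long)nums[j];
            if (-(long long)t <= diff && diff <= (long long)t)
                return true;
        }
    }
    return false;
}
```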
After analysis, for two indices i and j the condition is: |nums[i] - nums[j]| <= t && |i - j| <= k. We can maintain a sliding window of size at most k; every element in the window automatically satisfies the index condition, so the main problem is the value condition. For the current value nums[i] and a value x in the window, we need -t <= x - nums[i] <= t, i.e., nums[i] - t <= x <= nums[i] + t. So we can decide the result by checking whether the window contains an element in that range.
Codes
```cpp
typedef long long LL;
```
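The rest of the original code block appears to have been lost; below is a sketch of the approach described in the analysis, using a `std::set` as the sorted window and `lower_bound` for the range query (the function name follows the usual LeetCode signature, which is an assumption here):

```cpp
#include <set>
#include <vector>
using namespace std;

typedef long long LL;

// Maintain a sorted window (std::set) of at most k previous values.
// For each nums[i], look for an x in the window with
// nums[i] - t <= x <= nums[i] + t. O(n log k) time.
bool containsNearbyAlmostDuplicate(vector<int>& nums, int k, int t) {
    set<LL> window;  // values of the previous (at most k) elements
    for (int i = 0; i < (int)nums.size(); ++i) {
        // Smallest x in the window with x >= nums[i] - t.
        auto it = window.lower_bound((LL)nums[i] - t);
        if (it != window.end() && *it <= (LL)nums[i] + t)
            return true;  // found x with |x - nums[i]| <= t
        window.insert(nums[i]);
        if ((int)window.size() > k)       // keep |i - j| <= k
            window.erase((LL)nums[i - k]);
    }
    return false;
}
```

Computing `nums[i] - t` and `nums[i] + t` in `LL` (the `typedef` from the snippet above) avoids the int overflow that the plain subtraction would risk.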