What happens when you give the batch size as 2 and sliding window as 1 in spark?
Answers
Answer:
What that means is that streaming data is divided into batches based on a time slice called the batch interval. As a simple example, let's say the batch interval is 10 seconds and, every 30 seconds, we need to know what happened in the last 60 seconds. Here 60 seconds is called the window length and 30 seconds the slide interval.
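For concreteness, here is a minimal sketch of that setup in Scala. The batch interval (10 s), window length (60 s) and slide interval (30 s) come from the example above; the socket source on localhost:9999 and the checkpoint directory are my own assumptions, not part of the original answer.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WindowExample").setMaster("local[2]")
    // Batch interval of 10 seconds: the stream is cut into one RDD every 10 s
    val ssc = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint("/tmp/spark-window-checkpoint") // required for windowed state

    // Hypothetical text source for illustration
    val lines = ssc.socketTextStream("localhost", 9999)

    val wordCounts = lines.flatMap(_.split(" "))
      .map(word => (word, 1))
      // Window length 60 s, slide interval 30 s: every 30 s, count the words
      // seen over the last 60 s, i.e. over the last 6 batches
      .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Seconds(60), Seconds(30))

    wordCounts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```

Both the window length and the slide interval must be multiples of the batch interval, since the window is always assembled from whole batches.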
Answer:
Spark Streaming takes advantage of windowed computations in Spark. It lets you apply transformations over a sliding window of data: as the window slides over a source DStream, the source RDDs that fall within the window are combined.
hope it helps you
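A small sketch of that windowed-DStream idea, reusing the 10-second batch interval from the first answer; the socket source is again a hypothetical stand-in for whatever stream you are reading.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("DStreamWindow").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(10)) // 10-second batch interval

// Hypothetical source DStream
val lines = ssc.socketTextStream("localhost", 9999)

// window(windowLength, slideInterval): every 30 s, produce an RDD that is the
// union of the source RDDs (batches) falling inside the last 60 s of the stream
val windowed = lines.window(Seconds(60), Seconds(30))
windowed.count().print()

ssc.start()
ssc.awaitTermination()
```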