Testing Spark Structured Streaming Applications

import scala.io.Source
import scala.reflect.ClassTag

// Reads a JSON scenario file, parses it into a Scenario, and converts each input
// record into a typed message. `mapper` (a Jackson ObjectMapper), `Scenario`, and
// `convert` are defined elsewhere in the post.
def loadScenario[T <: Message : ClassTag](file: String): Seq[T] = {
  val fileString = Source.fromFile(file).mkString
  val parsed = mapper.readValue(fileString, classOf[Scenario])
  parsed.input.map { data =>
    val json = mapper.writeValueAsString(data)
    convert[T](json)
  }
}
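A test can then load a scenario by type. The `CallEvent` type and the file path below are hypothetical placeholders for illustration, not identifiers from this post:

// Hypothetical usage: CallEvent and the scenario path are placeholders.
val events = loadScenario[CallEvent]("src/test/resources/scenarios/call_events.json")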
// Mimics the wire format of the Kafka source: a binary key and a binary value per record.
case class MockKafkaDataFrame(key: Array[Byte], value: Array[Byte])
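A minimal sketch of how these mock records could drive a streaming query in a test, assuming a `SparkSession` named `spark` supplied by the test harness; the use of `MemoryStream` here is one way to do it, not necessarily the approach taken in this post:

// Sketch only: assumes a SparkSession `spark` provided by the test harness.
import org.apache.spark.sql.execution.streaming.MemoryStream

import spark.implicits._
implicit val sqlContext = spark.sqlContext

// MemoryStream lets the test push records into a streaming Dataset on demand.
val input = MemoryStream[MockKafkaDataFrame]
val kafkaLikeDF = input.toDF()  // a DataFrame with binary `key` and `value` columns

// Feed one record the way the Kafka source would deliver it.
input.addData(MockKafkaDataFrame("call-sid".getBytes, """{"status":"completed"}""".getBytes))

From here the test can start a query against `kafkaLikeDF` and call `processAllAvailable()` on the running query to assert on the results deterministically.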
