Working with Apache Spark

Description

In this course, you will learn about Apache Spark and its features in detail. The course deep-dives into the features of Apache Spark: RDDs, transformations, actions, lazy execution, DataFrames, Datasets, Spark SQL, Spark Streaming, PySpark, sparklyr, and Spark jobs.

You will explore creating Spark RDDs and performing various transformations and actions on them. The course also illustrates the differences between RDDs, DataFrames, and Datasets with examples. You will also explore the features of Spark SQL and execute database queries using various contexts.
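To give a flavour of these operations, here is a minimal sketch, assuming it runs inside `spark-shell` where a `SparkSession` (`spark`) and `SparkContext` (`sc`) are predefined; the data and names are illustrative, not taken from the course:

```scala
// Create an RDD from a local collection.
val nums = sc.parallelize(Seq(1, 2, 3, 4, 5))

// Transformations are lazy: nothing executes yet.
val doubledEvens = nums.filter(_ % 2 == 0).map(_ * 2)

// An action (collect) triggers execution of the whole lineage.
println(doubledEvens.collect().mkString(", "))

// The same data as a DataFrame, registered as a view for Spark SQL.
val df = spark.range(1, 6).toDF("n")
df.createOrReplaceTempView("numbers")
spark.sql("SELECT n * 2 AS doubled FROM numbers WHERE n % 2 = 0").show()
```

Both pipelines compute the same result; the DataFrame/SQL version additionally benefits from Spark's query optimizer, which is one of the differences the course explores.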

In this course, you will also explore Spark Streaming together with Kafka; the streaming examples include producing and consuming messages on a Kafka topic. The Spark programs in this course are written primarily in Scala, but PySpark is also discussed, and programming examples using PySpark are included.
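As a rough sketch of consuming a Kafka topic from Spark (using the Structured Streaming Kafka source; the course may instead use the older DStream-based API, and the broker address and topic name below are placeholders):

```scala
import org.apache.spark.sql.SparkSession

object KafkaConsumerSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("KafkaConsumerSketch")
      .getOrCreate()

    // Subscribe to a Kafka topic; broker and topic names are placeholders.
    val messages = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "demo-topic")
      .load()

    // Kafka keys and values arrive as binary; cast them to strings.
    val decoded = messages.selectExpr(
      "CAST(key AS STRING)", "CAST(value AS STRING)")

    // Print each micro-batch to the console until the job is stopped.
    decoded.writeStream
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```

Running this requires the `spark-sql-kafka` connector package on the classpath (e.g. via `--packages` with `spark-submit`) and a reachable Kafka broker.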

Using the sparklyr package from R is also covered. Finally, the course shows how to schedule and execute Spark jobs.

Who this course is for:

  • Data Scientists / Data Engineers
  • Big Data Developers
  • Big Data Engineers
  • Big Data Architects
  • Any technical personnel interested in learning and exploring the features of Apache Spark

Requirements

  • Working knowledge of the Cloudera Hadoop stack
  • Basic Programming Knowledge
  • Basic Linux Commands

Last Updated 9/2020

Download Links

Direct Download

Working with Apache Spark.zip (990.2 MB) | Mirror

Torrent Download

Working with Apache Spark.torrent (56 KB) | Mirror

Source : https://www.udemy.com/course/working-with-apache-spark/
