Tag: HDFS

Google Cloud: Creating Dataproc Cluster Using Google Cloud and Running a Pyspark Job

Description: This hands-on lab introduces how to use Google Cloud Storage (GCS) as the primary input and output location for Dataproc cluster jobs. Using GCS instead of the Hadoop Distributed File System (HDFS) lets us treat clusters as ephemeral: clusters that are no longer in use can be deleted while the data is preserved. […]
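The ephemeral-cluster workflow the lab describes can be sketched with the gcloud CLI. This is a minimal illustration, not the lab's exact steps: the cluster name, region, bucket, and script path below are hypothetical placeholders.

```shell
# Create a small Dataproc cluster (name and region are placeholders).
gcloud dataproc clusters create demo-cluster \
    --region=us-central1 \
    --single-node

# Submit a PySpark job whose input and output live in GCS, not HDFS,
# so the data outlives the cluster. Paths after "--" are job arguments.
gcloud dataproc jobs submit pyspark gs://my-bucket/scripts/wordcount.py \
    --cluster=demo-cluster \
    --region=us-central1 \
    -- gs://my-bucket/input/ gs://my-bucket/output/

# With results safely in GCS, the cluster itself can be deleted.
gcloud dataproc clusters delete demo-cluster \
    --region=us-central1 \
    --quiet
```

Because every `gs://` path is independent of any one cluster, a new cluster created later can read the same input and output locations unchanged.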
