apache/spark-docker

Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python, and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools, including Spark SQL for SQL and DataFrames, pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.

https://spark.apache.org/

You can use the 3.4.0 PR as a reference.

  • 1.1 Add the GPG key to tools/template.py

    This GPG key will be used by the Dockerfiles (such as 3.4.0) to verify the signature of the Apache Spark tarball.

  • 1.2 Add an image build workflow (such as the 3.4.0 YAML)

    This file will be used by GitHub Actions to build the Docker images when you submit the PR, to make sure the Dockerfiles are correct and pass all tests (build/standalone/Kubernetes).

  • 1.3 Run ./add-dockerfiles.sh [version] to add the Dockerfiles.

    You will get a new directory containing the Dockerfiles for the specified version.

  • 1.4 Add the version and tag info to versions.json, publish.yml, and test.yml.

    This version info will be used by the image build workflow (such as the 3.4.0 reference) and the Docker Official Image.
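Step 1.3 above can be sketched as follows. This is a minimal illustration of the assumed template-substitution behavior, not the real script (which likely renders its templates via tools/template.py); the placeholder name and version number below are hypothetical.

```shell
# Minimal sketch of the rendering that ./add-dockerfiles.sh [version] performs
# (assumed behavior): produce a version-specific Dockerfile from a shared template.
VERSION=3.5.0   # hypothetical version
mkdir -p "$VERSION"

# A toy template standing in for the real Dockerfile template.
cat > template.txt <<'EOF'
ARG spark_version={{ SPARK_VERSION }}
EOF

# Substitute the placeholder to produce the version-specific Dockerfile.
sed "s/{{ SPARK_VERSION }}/$VERSION/" template.txt > "$VERSION/Dockerfile"
cat "$VERSION/Dockerfile"   # prints: ARG spark_version=3.5.0
```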

Click Publish (Java 21 only) or Publish (Java 17 only) (for 4.x), or Publish (for 3.x), to publish the images.

After this, the apache/spark Docker images will be published.

Submit the PR to docker-library/official-images; see docker-library/official-images#15363 as a reference.

You can run tools/manifest.py manifest to generate the content.
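For context, entries in docker-library/official-images follow the library-file format (Tags, Architectures, GitCommit, Directory). A hypothetical entry might look like the fragment below; the maintainer handle, tag names, commit, and directory are illustrative placeholders, not actual values.

```
Maintainers: Apache Spark Developers <dev@spark.apache.org> (@example-maintainer)
GitRepo: https://github.com/apache/spark-docker.git

Tags: 3.4.0-scala2.12-java11-python3-ubuntu, 3.4.0
Architectures: amd64, arm64v8
GitCommit: 0000000000000000000000000000000000000000
Directory: 3.4.0/scala2.12-java11-python3-ubuntu
```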

After this, the spark Docker Official Images will be published.

|               | Apache Spark Image | Spark Docker Official Image |
|---------------|--------------------|-----------------------------|
| Name          | apache/spark       | spark                       |
| Maintenance   | Reviewed and published by the Apache Spark community | Reviewed, published, and maintained by the Docker community |
| Update policy | Built and pushed once when a specific version is released | Actively rebuilt for updates and security fixes |
| Link          | https://hub.docker.com/r/apache/spark | https://hub.docker.com/_/spark |
| Source        | apache/spark-docker | apache/spark-docker and docker-library/official-images |

We recommend using the Spark Docker Official Image; the Apache Spark Image is provided in case of delays in the Docker community's review process.

This repository contains the Dockerfiles used to build the Apache Spark Docker Image.

See more in SPARK-40513: SPIP: Support Docker Official Image for Spark.
