Commit d2989a9

[SPARK-56186][PYTHON][INFRA][FOLLOWUP] Cleanup remaining PyPy related code
### What changes were proposed in this pull request?
This PR removes the remaining PyPy-related code.

### Why are the changes needed?
Most of the removal was done in #54988, but a few references were left over, and removing them was suggested in #55369 (comment).

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
GA (GitHub Actions).

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #55505 from sarutak/remove-pypy.

Authored-by: Kousuke Saruta <sarutak@amazon.co.jp>
Signed-off-by: Kousuke Saruta <sarutak@apache.org>
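A cleanup like this can be sanity-checked with a recursive, case-insensitive search for leftover references. A minimal sketch, using a hypothetical sample directory that stands in for the real `.github/` and `dev/` trees:

```shell
# Hypothetical sketch: scan a tree for leftover PyPy references after the cleanup.
# A tiny sample tree is created here so the snippet is self-contained.
mkdir -p /tmp/pypy-scan-demo/.github
printf 'jobs:\n  build:\n' > /tmp/pypy-scan-demo/.github/build.yml

# Case-insensitive content search; grep exits non-zero when nothing matches,
# which is the desired state once the cleanup is complete.
grep -r -n -i "pypy" /tmp/pypy-scan-demo || echo "no PyPy references left"
```

In an actual Spark checkout, `git grep -i pypy -- .github dev` would perform the equivalent check against tracked files only.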
1 parent 174fc60 commit d2989a9

3 files changed (0 additions, 97 deletions)

.github/workflows/build_infra_images_cache.yml

Lines changed: 0 additions & 14 deletions
```diff
@@ -32,7 +32,6 @@ on:
       - 'dev/spark-test-image/sparkr/Dockerfile'
       - 'dev/spark-test-image/python-minimum/Dockerfile'
       - 'dev/spark-test-image/python-ps-minimum/Dockerfile'
-      - 'dev/spark-test-image/pypy-310/Dockerfile'
       - 'dev/spark-test-image/python-310/Dockerfile'
       - 'dev/spark-test-image/python-311/Dockerfile'
       - 'dev/spark-test-image/python-312/Dockerfile'
@@ -140,19 +139,6 @@ jobs:
       - name: Image digest (PySpark PS with old dependencies)
         if: hashFiles('dev/spark-test-image/python-ps-minimum/Dockerfile') != ''
         run: echo ${{ steps.docker_build_pyspark_python_ps_minimum.outputs.digest }}
-      - name: Build and push (PySpark with PyPy 3.10)
-        if: hashFiles('dev/spark-test-image/pypy-310/Dockerfile') != ''
-        id: docker_build_pyspark_pypy_310
-        uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8
-        with:
-          context: ./dev/spark-test-image/pypy-310/
-          push: true
-          tags: ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-pypy-310-cache:${{ github.ref_name }}-static
-          cache-from: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-pypy-310-cache:${{ github.ref_name }}
-          cache-to: type=registry,ref=ghcr.io/apache/spark/apache-spark-github-action-image-pyspark-pypy-310-cache:${{ github.ref_name }},mode=max
-      - name: Image digest (PySpark with PyPy 3.10)
-        if: hashFiles('dev/spark-test-image/pypy-310/Dockerfile') != ''
-        run: echo ${{ steps.docker_build_pyspark_pypy_310.outputs.digest }}
       - name: Build and push (PySpark with Python 3.10)
         if: hashFiles('dev/spark-test-image/python-310/Dockerfile') != ''
         id: docker_build_pyspark_python_310
```

dev/infra/Dockerfile

Lines changed: 0 additions & 12 deletions
```diff
@@ -90,18 +90,6 @@ RUN Rscript -e "install.packages(c('remotes', 'knitr', 'markdown', \
 # See more in SPARK-39735
 ENV R_LIBS_SITE="/usr/local/lib/R/site-library:${R_LIBS_SITE}:/usr/lib/R/library"
 
-
-RUN add-apt-repository ppa:pypy/ppa
-RUN mkdir -p /usr/local/pypy/pypy3.10 && \
-    curl -sqL https://downloads.python.org/pypy/pypy3.10-v7.3.17-linux64.tar.bz2 | tar xjf - -C /usr/local/pypy/pypy3.10 --strip-components=1 && \
-    ln -sf /usr/local/pypy/pypy3.10/bin/pypy /usr/local/bin/pypy3.10 && \
-    ln -sf /usr/local/pypy/pypy3.10/bin/pypy /usr/local/bin/pypy3
-RUN curl -sS https://bootstrap.pypa.io/get-pip.py | pypy3
-RUN echo 'meson<1.11.0' > /tmp/constraints.txt && \
-    PIP_CONSTRAINT=/tmp/constraints.txt pypy3 -m pip install numpy 'six==1.16.0' 'pandas==2.3.3' scipy coverage matplotlib 'lxml==4.9.4' && \
-    rm -f /tmp/constraints.txt
-
-
 ARG BASIC_PIP_PKGS="numpy pyarrow>=18.0.0 six==1.16.0 pandas==2.3.3 scipy plotly>=4.8 mlflow>=2.8.1 coverage matplotlib openpyxl memory-profiler>=0.61.0 scikit-learn>=1.3.2"
 # Python deps for Spark Connect
 ARG CONNECT_PIP_PKGS="grpcio==1.76.0 grpcio-status==1.76.0 protobuf==6.33.5 googleapis-common-protos==1.71.0 graphviz==0.20.3"
```

dev/spark-test-image/pypy-310/Dockerfile

Lines changed: 0 additions & 71 deletions
This file was deleted.
