### What changes were proposed in this pull request?
This PR retires PyPy support:
* Remove all PyPy-related code in PySpark (the only part that really mattered is the simplified traceback handling, so PyPy will probably still work)
* Remove all PyPy-specific test skips
* Remove the master CI job for PyPy. **branch-4.0 and branch-4.1 tests are kept**
* Remove the PyPy 3.11 Docker image (the 3.10 one is kept for testing)
* Remove PyPy from the docs (we should probably do the same for the actual Spark website too)
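For context, PyPy-specific branches and test skips in Python code are usually guarded by an interpreter check. The sketch below is a hypothetical illustration of that pattern, not the actual PySpark code this PR removes; the function and test names are invented for the example.

```python
import platform
import unittest


def is_pypy() -> bool:
    # platform.python_implementation() reports "PyPy" on PyPy
    # and "CPython" on the standard interpreter.
    return platform.python_implementation() == "PyPy"


class TracebackTest(unittest.TestCase):
    # The kind of skip marker this PR removes from the test suite:
    # the test only runs when the interpreter is not PyPy.
    @unittest.skipIf(is_pypy(), "behavior differs on PyPy")
    def test_cpython_only_behavior(self):
        self.assertFalse(is_pypy())
```

Dropping support mostly means deleting guards and skips like these, rather than removing any large subsystem.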
### Why are the changes needed?
We discussed dropping support for PyPy in https://lists.apache.org/thread/glcq0zgr33sozo7y4y7jqph24yh3m92p and received many +1s and no -1s.
`numpy` has dropped support for PyPy, and PyPy itself is not really under active development anymore.
### Does this PR introduce _any_ user-facing change?
Yes, PyPy is no longer officially supported. We still expect most of the old PyPy code paths to work, but we make no promises.
### How was this patch tested?
CI.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #54988 from gaogaotiantian/retire-pypy.
Authored-by: Tian Gao <gaogaotiantian@hotmail.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
`docs/rdd-programming-guide.md` (1 addition, 2 deletions)
```diff
@@ -40,7 +40,7 @@ along with if you launch Spark's interactive shell -- either `bin/spark-shell` f
 <div data-lang="python" markdown="1">

 Spark {{site.SPARK_VERSION}} works with Python 3.10+. It can use the standard CPython interpreter,
-so C libraries like NumPy can be used. It also works with PyPy 7.3.6+.
+so C libraries like NumPy can be used.

 Spark applications in Python can either be run with the `bin/spark-submit` script which includes Spark at runtime, or by including it in your setup.py as:
@@ -71,7 +71,6 @@ you can specify which version of Python you want to use by `PYSPARK_PYTHON`, for
```