[FLINK-35180][python] Fix embedded (thread-mode) type converters #27927
Draft
polsinello wants to merge 1 commit into apache:release-1.19 from
Conversation
…temporal, numeric, and nested types

Pemja auto-converts Python datetime/date/time to java.sql.Timestamp/Date/Time, but Flink's ExternalSerializer and DataFormatConverters.RowConverter expect the modern java.time.* bridge classes (LocalDateTime, LocalDate, LocalTime, Instant). In thread mode this mismatch causes ClassCastException at serialization boundaries. Process mode is unaffected because its Beam-based runnerOutputTypeSerializer resolves to the correct java.time.* serializers via LegacyTypeInfoDataTypeConverter.

This patch adds bridge-aware DataConverters for all temporal types on both the Table API and DataStream paths, replaces IdentityDataConverter where it silently passed java.sql.* through, fixes row/tuple buffer reuse that caused silent data corruption in ARRAY<ROW<>>, widens numeric DataConverter generics from Long/Double to Number to avoid bridge-method ClassCastException, and caches the original Java ExternalTypeInfo to prevent lossy DECIMAL precision round-trips through legacy TypeInfo conversion.
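The core temporal fix relies on the JDK's own bridge methods on the java.sql.* classes. A minimal sketch of what a bridge-aware conversion looks like (the class and method names below are illustrative, not the actual converters added by this patch):

```java
import java.sql.Date;
import java.sql.Time;
import java.sql.Timestamp;

// Hypothetical sketch of a bridge-aware temporal conversion; not the
// actual DataConverter classes from this patch.
public final class TemporalBridge {

    // Pemja hands thread-mode UDF results over as java.sql.* values;
    // map them to the java.time.* classes ExternalSerializer expects.
    // (TIMESTAMP_LTZ would use Timestamp.toInstant() instead, depending
    // on the logical type.)
    public static Object toExternal(Object value) {
        if (value instanceof Timestamp) {
            return ((Timestamp) value).toLocalDateTime(); // TIMESTAMP
        }
        if (value instanceof Date) {
            return ((Date) value).toLocalDate();          // DATE
        }
        if (value instanceof Time) {
            return ((Time) value).toLocalTime();          // TIME
        }
        return value; // non-temporal values pass through unchanged
    }
}
```

Note the instanceof order is safe here: java.sql.Timestamp and java.sql.Time extend java.util.Date, not java.sql.Date, so the three branches never shadow each other.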
What is the purpose of the change
In embedded (thread) mode, Pemja's JNI bridge auto-converts Python datetime/date/time objects to java.sql.Timestamp/Date/Time, but Flink's ExternalSerializer and DataFormatConverters.RowConverter expect the modern java.time.* bridge classes (LocalDateTime, LocalDate, LocalTime, Instant). This mismatch causes ClassCastException at serialization boundaries for any UDF that returns or receives temporal types. Process mode is unaffected because its Beam-based runnerOutputTypeSerializer resolves to the correct java.time.* serializers via LegacyTypeInfoDataTypeConverter.

This patch fixes several related type-conversion bugs in the embedded Python execution path so that all Flink-supported types work correctly in thread mode.
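A related ClassCastException shows up on the numeric path through generic type erasure: a converter typed over Long gets an erased bridge method that casts its argument to Long, which throws when Pemja returns an Integer for a small value. A simplified, hypothetical sketch of the failure mode and the widening that avoids it (this is not the actual Flink DataConverter interface):

```java
// Simplified, hypothetical sketch of the Long -> Number generics widening;
// not the actual Flink DataConverter interface.
public class NumericWidening {

    interface Converter<T> {
        long toInternal(T external);
    }

    // Narrow version: the erased bridge method casts its argument to Long,
    // so an Integer arriving from Pemja triggers ClassCastException.
    static final Converter<Long> NARROW = external -> external;

    // Widened version: any boxed numeric type is accepted and converted
    // explicitly via Number.longValue().
    static final Converter<Number> WIDE = external -> external.longValue();

    @SuppressWarnings({"rawtypes", "unchecked"})
    static long convertRaw(Converter converter, Object fromPemja) {
        // Erased call site, as seen from a generic conversion pipeline.
        return converter.toInternal(fromPemja);
    }
}
```

Calling convertRaw(NARROW, Integer.valueOf(7)) throws ClassCastException at the erased cast, while the widened converter handles Integer, Long, Float, or Double uniformly.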
Brief change log
- Added DataConverter implementations for all temporal types (TIMESTAMP, DATE, TIME, TIMESTAMP_LTZ) on both the Table API (PythonTypeUtils in flink-table-runtime) and DataStream (PythonTypeUtils in flink-streaming-java) paths
- Replaced IdentityDataConverter for temporal types where it silently passed java.sql.* objects through without conversion
- Fixed ROW/TUPLE buffer reuse in ArrayDataConverter that caused silent data corruption in ARRAY<ROW<...>> results
- Widened DataConverter generics from Long/Double to Number to avoid bridge-method ClassCastException when Pemja returns Integer/Float for small values
- Cached the original Java ExternalTypeInfo in the Python ExternalTypeInfo wrapper to prevent lossy DECIMAL precision round-trips through legacy TypeInfo conversion

Verifying this change
This change is already covered by existing tests — the existing PyFlink Table API and DataStream test suites exercise all affected type paths.
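One of the subtler paths those tests exercise is the ARRAY<ROW<...>> buffer reuse: storing the same reused row buffer into every array slot makes all slots alias the last row. A minimal, hypothetical sketch of the failure mode and the per-element copy that fixes it (not the actual ArrayDataConverter code; a plain Object[] stands in for the reused Row):

```java
import java.util.Arrays;

// Hypothetical sketch of the ARRAY<ROW<...>> buffer-reuse bug; not the
// actual ArrayDataConverter from this patch.
public class ArrayRowCopy {

    // Buggy shape: every slot ends up pointing at the same reused buffer,
    // silently corrupting all elements converted before the last one.
    static Object[][] convertWithSharedBuffer(int[][] source) {
        Object[] reused = new Object[1];            // one buffer for all rows
        Object[][] out = new Object[source.length][];
        for (int i = 0; i < source.length; i++) {
            reused[0] = source[i][0];
            out[i] = reused;                        // alias, not a copy
        }
        return out;
    }

    // Fixed shape: copy the buffer before storing it in the result array.
    static Object[][] convertWithCopy(int[][] source) {
        Object[] reused = new Object[1];
        Object[][] out = new Object[source.length][];
        for (int i = 0; i < source.length; i++) {
            reused[0] = source[i][0];
            out[i] = Arrays.copyOf(reused, reused.length);
        }
        return out;
    }
}
```

With input {{1}, {2}}, the shared-buffer version yields {2} in both slots, while the copying version preserves {1} and {2}.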
Does this pull request potentially affect one of the following parts:
@Public(Evolving): no
…java.time.* factory call per value).

Documentation