Convert spark df to splink df #2194
I have a use case where I want to read my predictions from parquet files, filter them, and then apply clustering. To do this I need to convert a Spark DataFrame to a Splink DataFrame - is this supported?
Answered by aymonwuolanne on May 22, 2024
Answer selected by guylissak
Yep, you can use
df_predict = linker.register_table(<your_spark_df>, "predictions")
- see the documentation here