VR-7870 Support passing a dataframe in the client predict method
I'm thinking of something like this, in order to allow the user to define their model in terms of a data frame (modulo some wrapper). I still need to verify that it actually works 😬 But let me know if something like this would be problematic, design-wise 🤔
I think it would be important to maintain backwards compatibility (users who are passing a DataFrame to predict(), but are handling it as a plain array of values). @conradoverta What do you think?
We need to ship this asap. I'm ok breaking compatibility on this and adding a note.
Make sure that we provide the conversion back too.
@conradoverta what do you mean by conversion back 🤔
How do I get my dataframe back inside my prediction model?
@conradoverta I think what you meant is:
class MyModel(object):
    @prediction_input_df
    def predict(self, input):
        # input here should be a dataframe

right? It's handled by the prediction_input_df decorator, which, receiving the dictionary, will convert it back to a data frame and feed it to the predict method. This is analogous to other helper decorators that I found in DeployedModel.
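For reference, a sketch of what such a decorator could look like: it rebuilds a DataFrame from the row dicts the service delivers before calling the wrapped predict(). This is an illustration of the idea only, not the actual implementation; the body of predict() here is just a placeholder.

```python
import functools
import pandas as pd

def prediction_input_df(predict_fn):
    """Sketch of the proposed decorator: reconstruct a DataFrame
    from the deserialized row dicts, then invoke predict() with it."""
    @functools.wraps(predict_fn)
    def wrapper(self, input):
        # pd.DataFrame accepts a list of row dicts directly
        return predict_fn(self, pd.DataFrame(input))
    return wrapper

class MyModel(object):
    @prediction_input_df
    def predict(self, input):
        # input arrives here as a DataFrame again
        return input["x"].sum()

model = MyModel()
result = model.predict([{"x": 1}, {"x": 2}])  # DataFrame rebuilt inside
```

The symmetry with the client-side serialization is the point: users write predict() purely in terms of DataFrames, and the conversions happen at the boundaries.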
@conradoverta
Did Jim's reply address this?
We're revisiting model I/O very, very soon. And that may come with changes in how the client handles predictions.
And I'm under the impression that users already find these sorts of decorators rather confusing 😬