sqlflow
Brings SQL and AI together.
In the `TO RUN` statement, the first parameter after the `CMD` keyword is the absolute path of a Python program or an executable in the docker image. Please check the following...
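A minimal sketch of such a statement, assuming a hypothetical docker image, script path, and flag name; only the `TO RUN` ... `CMD` structure comes from the issue above:

```sql
SELECT * FROM db.source_table
TO RUN my_registry/my_processor:v1.0  -- hypothetical docker image
CMD "/opt/sqlflow/process.py",        -- absolute path of the Python program in the image
    "--some_flag=some_value"          -- remaining parameters are passed to the program
INTO db.result_table;
```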
Currently, we use `IDENT` as the parameter of `USING` in the prediction clause. To support using models from the Model Zoo, we changed `IDENT` to allow characters like `[/_:.]`....
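For illustration, a prediction statement whose model identifier exercises those characters; the registry host, model path, and version tag here are hypothetical:

```sql
SELECT * FROM iris.test
TO PREDICT iris.predict.class
-- the identifier below contains '/', '_', ':', and '.'
USING modelzoo.example.com/my_group/dnn_classifier:v0.2;
```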
**Is your feature request related to a problem? Please describe.** As described in #2656, we also need to migrate the Alisa submitter to Python. **Describe the solution you'd like** Rewrite the Alisa...
As discussed in #2494, SQLFlow is to become a pure compiler. This requires us to support feature derivation/verification and COLUMN handling in Python. There's a reference implementation in https://github.com/sql-machine-learning/sqlflow/pull/2477/files#diff-606462e9e98c6c23cdf010f1ec243c0e....
### Problem
The SQLFlow COLUMN clause describes how data from the source table is transformed into the model input. The source table may contain hundreds of columns, or even more...
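A minimal sketch of the clause in question, assuming made-up table, column, and model names; `NUMERIC`, `CATEGORY_ID`, and `EMBEDDING` are SQLFlow column functions:

```sql
SELECT * FROM census.train
TO TRAIN DNNClassifier
WITH model.hidden_units = [64, 32], model.n_classes = 2
COLUMN NUMERIC(capital_gain),                      -- pass a numeric column through as-is
       EMBEDDING(CATEGORY_ID(workclass, 100), 8)   -- map to 100 category ids, embed into 8 dims
LABEL income_bracket
INTO my_census_model;
```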
### Refactor
- [ ] Port the existing feature derivation functionality into the new infrastructure. #2568

### Cover multiple frameworks
- [ ] PyTorch: the new COLUMN clause syntax can describe...
# Rethinking SQLFlow IR
The SQLFlow compiler generates an Argo workflow, which is a YAML file. Each workflow step calls the SQLFlow `runtime` library to submit an AI job or a SQL job....
About how to call the entry file on PAI. **Background** When submitting a job to PAI, we need to pack some resources into a tarball, then push the tarball to PAI, and finally...
SQLFlow is a compiler that compiles SQL statements into an Argo workflow YAML. However, the feature derivation stage must happen at runtime rather than while compiling the Argo YAML....
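For example, when a training statement omits the COLUMN clause entirely, the feature columns must be derived from the source table's schema and data, which are only visible once the workflow step actually runs; the table and model names below are illustrative:

```sql
-- No COLUMN clause: at runtime, feature derivation inspects iris.train
-- to decide, e.g., that sepal_length is a NUMERIC feature and what its type is.
SELECT * FROM iris.train
TO TRAIN DNNClassifier
WITH model.n_classes = 3, model.hidden_units = [10, 10]
LABEL class
INTO sqlflow_models.my_iris_model;
```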