Hil Liao
### Expected behavior
`kpt live apply` returns an error when the infrastructure-as-code change can't be applied in the namespace.
### Actual behavior
`kpt live apply` gets stuck...
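A sketch of a workaround while the hang is investigated, assuming a kpt v1 package directory `pkg/`; the `--reconcile-timeout` flag makes `kpt live apply` give up after a deadline instead of waiting indefinitely (verify the flag against your kpt version):

```shell
kpt live init pkg                           # one-time: create inventory metadata
kpt live apply pkg --reconcile-timeout=2m   # fail after 2m instead of hanging
```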
I observe that [google_cloud_pipeline_components.v1.custom_job.CustomTrainingJobOp](https://google-cloud-pipeline-components.readthedocs.io/en/google-cloud-pipeline-components-0.2.1/google_cloud_pipeline_components.experimental.custom_job.html) may not have a parameter that accepts a script path the way the following [code section](https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/custom/sdk-custom-image-classification-online.ipynb) does:
```
job = aiplatform.CustomTrainingJob(
    display_name=JOB_NAME,
    script_path="task.py",
    container_uri=TRAIN_IMAGE,
    requirements=["tensorflow_datasets==1.3.0"],
    model_serving_container_image_uri=DEPLOY_IMAGE,
)
MODEL_DISPLAY_NAME...
```
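As a possible workaround (a sketch, assuming the training script is already baked into the container image), the component-style custom job takes a `worker_pool_specs` list rather than a `script_path`, so the script is referenced through the container's `command`/`args`. The image URI, script path, and args below are hypothetical placeholders:

```python
# Build a worker_pool_specs payload of the shape Vertex AI custom jobs expect.
# TRAIN_IMAGE and task.py are placeholders; the script must already exist
# inside the image, since there is no script_path parameter to upload it.
TRAIN_IMAGE = "gcr.io/my-project/trainer:latest"  # hypothetical image

worker_pool_specs = [
    {
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {
            "image_uri": TRAIN_IMAGE,
            "command": ["python", "task.py"],
            "args": ["--epochs", "10"],
        },
    }
]
```

This list can then be passed to the component's `worker_pool_specs` parameter; check the exact parameter names against the installed `google-cloud-pipeline-components` version.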
- [constraint.yaml](https://github.com/open-policy-agent/gatekeeper-library/blob/master/library/general/httpsonly/samples/ingress-https-only/constraint.yaml) has been moved to a sub-directory in the commands at https://github.com/open-policy-agent/gatekeeper-library#how-to-use-the-library ; please update the README
- can I contribute a section of the README to use [GKE...
Python error encountered executing the following line at [Extract train and eval splits]: [sql_query = datasource_utils.get_training_source_query(](https://github.com/GoogleCloudPlatform/mlops-with-vertex-ai/blame/main/03-training-formalization.ipynb#L270)
```
sql_query = datasource_utils.get_training_source_query(
    PROJECT, REGION, DATASET_DISPLAY_NAME,
    ml_use='UNASSIGNED', limit=5000)
```
Observed error:
```
---------------------------------------------------------------------------
IndexError...
```
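An `IndexError` in this path often means a list lookup (for example, resolving `DATASET_DISPLAY_NAME` to an existing dataset) came back empty and `[0]` was taken anyway. A defensive sketch, with a hypothetical `find_dataset` helper standing in for the library's internal lookup:

```python
def find_dataset(datasets, display_name):
    """Return the first dataset dict whose display_name matches.

    Raises a descriptive ValueError instead of letting matches[0]
    fail with a bare IndexError when nothing matches.
    """
    matches = [d for d in datasets if d.get("display_name") == display_name]
    if not matches:
        raise ValueError(
            f"No dataset with display_name={display_name!r}; "
            "check DATASET_DISPLAY_NAME and that the dataset was created."
        )
    return matches[0]

# Example with in-memory stand-ins for a Vertex AI dataset listing:
datasets = [{"display_name": "chicago-taxi", "id": "123"}]
print(find_dataset(datasets, "chicago-taxi")["id"])  # → 123
```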
The latest release of the Python 3 package markupsafe is not compatible with [from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext](https://github.com/GoogleCloudPlatform/mlops-with-vertex-ai/blame/main/03-training-formalization.ipynb#L181):
```
import ml_metadata as mlmd
from ml_metadata.proto import metadata_store_pb2
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext
```
Error:
```
--------------------------------------------------...
```
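A commonly reported workaround for this class of failure (an assumption about this specific traceback, but consistent with markupsafe 2.1 removing `soft_unicode`, which older Jinja2 releases import) is to pin markupsafe to the last 2.0.x release before rerunning the notebook:

```shell
pip install "markupsafe==2.0.1"
```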
Users would like a way to create Cloud Build triggers in [the Terraform folder](https://github.com/GoogleCloudPlatform/mlops-with-vertex-ai/tree/main/provision/terraform) or with gcloud commands. I found that TensorFlow Transform uses the Apache Beam runner in the...
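A sketch of the gcloud route; the trigger name and branch pattern below are placeholders, and the flags are from the `gcloud builds triggers create github` command group (verify against your gcloud version):

```shell
gcloud builds triggers create github \
  --name="mlops-ci" \
  --repo-owner="GoogleCloudPlatform" \
  --repo-name="mlops-with-vertex-ai" \
  --branch-pattern="^main$" \
  --build-config="cloudbuild.yaml"
```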
After executing the command `sudo make install`, the kernel module isn't loaded. I loaded it manually with `sudo modprobe -v 8723du`, and it is loaded automatically after reboot. Another issue...
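For reference, on systemd-based distributions the persistent auto-load can also be made explicit (a sketch; the module name matches the `modprobe` call above):

```shell
# Load 8723du at every boot via systemd-modules-load
echo 8723du | sudo tee /etc/modules-load.d/8723du.conf
sudo modprobe -v 8723du   # load immediately without rebooting
```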
Enhance the use case to integrate with Google Cloud Logging and Cloud Storage in the post-processing of the image frame method. I want to have this reviewed as a draft....
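A sketch of what the integration could look like, assuming the real `google-cloud-storage` and `google-cloud-logging` clients are injected (the stub classes below stand in for them so the flow is self-contained; the function name `post_process_frame` is hypothetical):

```python
def post_process_frame(frame_bytes, frame_id, bucket, logger):
    """Upload a processed frame to a storage bucket and log the result.

    `bucket` mimics google.cloud.storage.Bucket (blob().upload_from_string)
    and `logger` mimics a google.cloud.logging logger (log_text).
    """
    blob_name = f"frames/{frame_id}.jpg"
    bucket.blob(blob_name).upload_from_string(
        frame_bytes, content_type="image/jpeg")
    logger.log_text(f"Uploaded frame {frame_id} to {blob_name}")
    return blob_name

# In-memory stand-ins so the sketch runs without GCP credentials.
class FakeBlob:
    def __init__(self, store, name):
        self.store, self.name = store, name
    def upload_from_string(self, data, content_type=None):
        self.store[self.name] = data

class FakeBucket:
    def __init__(self):
        self.store = {}
    def blob(self, name):
        return FakeBlob(self.store, name)

class FakeLogger:
    def __init__(self):
        self.lines = []
    def log_text(self, text):
        self.lines.append(text)

bucket, logger = FakeBucket(), FakeLogger()
name = post_process_frame(b"\xff\xd8", "0001", bucket, logger)
print(name)  # → frames/0001.jpg
```

With real clients, `bucket` would come from `storage.Client().bucket(...)` and `logger` from the Cloud Logging client; the function body is unchanged.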
1. The [Scio](https://github.com/spotify/dbeam/blob/master/scio) link is broken in the README content.
2. The examples can't run easily. After I changed CLASS_PATH from dbeam-core_2.12.jar to dbeam/target/scala-2.12/*.jar and ran the following command, I got the error: Error: Unable to...
Other folders have a Dockerfile for the Cloud Run service. A [Procfile](https://github.com/GoogleCloudPlatform/python-docs-samples/blob/9e5fec93d48b041d57420998481b9ab4eef8d987/eventarc/storage_handler/Procfile#L1) should not be here; a Dockerfile is expected.
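A minimal Dockerfile sketch matching the pattern in sibling samples (the `main:app` entrypoint and gunicorn usage are assumptions about this particular sample):

```dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Cloud Run injects $PORT; bind gunicorn to it.
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 main:app
```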