Add `vertex_ai_model_resource` :copy
As the vertex_ai_models work continues to grow, rather than treating it as one massive PR, the goal is to split it by the different ways a model can be created in Google Cloud. One of those is :copy; the other is :upload, which is handled in PR #9767 and will take much more time because of the different variables that come with creating a model (tabular models, image recognition models, speech models, etc.).
With :copy complete, including the ability to create and delete copied models within a project, what's left is writing tests.
The simplest Terraform config to copy an existing model in your project:
resource "google_vertex_ai_models" "copy" {
project = "hc-17ccbe3c280c434a87e301c7c95"
source_model = "projects/hc-17ccbe3c280c434a87e301c7c95/locations/us-central1/models/5460298988349554688"
region = "us-central1"
}
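Beyond the required arguments, the generated schema in the reports below also lists a few optional fields relevant to copying (model_id, parent_model, encryption_spec). Here is a hedged sketch of a fuller copy config; the model ID, key ring, and key names are placeholders, and whether each optional argument is wired through on the :copy path is an assumption based on those field lists.
# Hedged sketch: copy a model under a chosen model ID with a customer-managed key.
# Values are placeholders; support for these optional arguments on :copy is assumed.
resource "google_vertex_ai_models" "copy_with_options" {
  project      = "hc-17ccbe3c280c434a87e301c7c95"
  region       = "us-central1"
  source_model = "projects/hc-17ccbe3c280c434a87e301c7c95/locations/us-central1/models/5460298988349554688"

  # Hypothetical: assign a specific ID to the copied model instead of a generated one.
  model_id = "copied-model-example"

  # Hypothetical: encrypt the copied model with a customer-managed key.
  encryption_spec {
    kms_key_name = "projects/hc-17ccbe3c280c434a87e301c7c95/locations/us-central1/keyRings/example-ring/cryptoKeys/example-key"
  }
}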
Release Note Template for Downstream PRs (will be copied)
vertex_ai: added `vertex_ai_model_resource`
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3092 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3092 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Open in Cloud Shell: Diff ( 4 files changed, 114 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
source_model = # value needed
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 47
Passed tests: 46
Skipped tests: 0
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelBasicExample [Error message] [Debug log]
🔴 Please fix these to complete your PR. View the build log or the debug log for each test.
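For the :copy path specifically, most of the fields in the report above belong to the :upload flow being split into the separate PR, so a copy-focused acceptance test should only need the copy-related arguments. Below is a hedged sketch of a minimal test fixture; the %{...} placeholder names and the tf-test prefix are assumptions about the test tooling, not the generated test itself.
# Hedged sketch of an acceptance-test fixture for the :copy path.
# Placeholder names (%{project}, %{source_model_id}, %{random_suffix}) are assumptions.
resource "google_vertex_ai_models" "primary" {
  project = "%{project}"
  region  = "us-central1"

  # Source model to copy; the ID would be supplied by the test environment.
  source_model = "projects/%{project}/locations/us-central1/models/%{source_model_id}"

  # Hypothetical: give the copy a deterministic, test-scoped model ID.
  model_id = "tf-test-copied-model%{random_suffix}"
}
The remaining container_spec and metadata fields in the report would then be exercised by the :upload PR's tests.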
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3093 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3093 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 47
Passed tests: 46
Skipped tests: 0
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Please fix these to complete your PR. View the build log or the debug log for each test.
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3077 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3077 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 47
Passed tests: 46
Skipped tests: 0
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Please fix these to complete your PR. View the build log or the debug log for each test.
Hey! I'm closing this PR as a part of a cleanup of older inactive PRs, using a threshold of PRs last updated over 3 months ago. This doesn't represent rejection of the change, and feel free to comment for me to reopen it if you plan to pick it back up, or feel free to start a new PR with the same changes in the future.
@c2thorn can the script that converts Ruby code to Go code be applied here?
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3046 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3046 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 54
Passed tests: 50
Skipped tests: 3
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
- TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Errors occurred during RECORDING mode. Please fix them to complete your PR.
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3046 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3046 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 54
Passed tests: 50
Skipped tests: 3
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
- TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Errors occurred during RECORDING mode. Please fix them to complete your PR.
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3064 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3064 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 807 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 54
Passed tests: 50
Skipped tests: 3
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
- TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Errors occurred during RECORDING mode. Please fix them to complete your PR.
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3064 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3064 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 807 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 54
Passed tests: 50
Skipped tests: 3
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
- TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Errors occurred during RECORDING mode. Please fix them to complete your PR.
@BBBmau As follow up to my last review, I've got some questions:
1. Are we going to use a feature branch for this work or not? Either choice is fine, but I don't think that was firmly decided either way.
2. Is the 'copy' PR being done before the 'upload' PR?
Re: number 2, could you please mark whichever PR should be done first as non-draft and ready for review? We should then ensure that the PRs only contain code relevant to that creation method (e.g. the pre_create in this PR includes code related to create via upload).
Yes, the idea is to make :copy the base for adding support to the model resource. I'll be addressing these points and will get to work on separating the upload work into its own PR.
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3062 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3062 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 797 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
display_name = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3032 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3032 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 797 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
display_name = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3032 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3032 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 797 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models (1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
display_name = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 54
Passed tests: 50
Skipped tests: 3
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
- TestAccVertexAIModels_vertexAiModelSourceBasicExample
🔴 Tests failed during RECORDING mode:
TestAccVertexAIModels_vertexAiModelSourceBasicExample [Error message] [Debug log]
🔴 Errors occurred during RECORDING mode. Please fix them to complete your PR.