magic-modules
Add `vertex_ai_model_resource` :upload
Fixes https://github.com/hashicorp/terraform-provider-google/issues/15303
Release Note Template for Downstream PRs (will be copied)
`google_vertex_ai_model`
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
Terraform GA: Diff ( 3 files changed, 2734 insertions(+), 2 deletions(-)) Terraform Beta: Diff ( 3 files changed, 2734 insertions(+), 2 deletions(-)) TF Conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(0 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
display_name = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
name = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
region = # value needed
source_model = # value needed
version_aliases = # value needed
}
Tests analytics
Total tests: 45
Passed tests: 45
Skipped tests: 0
Affected tests: 0
Click here to see the affected service packages
- vertexai
$\textcolor{green}{\textsf{All tests passed in REPLAYING mode.}}$ View the build log
Hello! Curious if there's any ETA on this. Would be super helpful to have this in the Google TF provider!
@stephaneden I'm back to working on this, it's been on and off the past few months 👍🏼
Cool! Looking forward to its availability! Thanks for the quick reply!
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 3 files changed, 2734 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 3 files changed, 2734 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 799 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(0 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
description = # value needed
display_name = # value needed
encryption_spec {
kms_key_name = # value needed
}
labels = # value needed
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
model = # value needed
model_id = # value needed
name = # value needed
parent_model = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
region = # value needed
source_model = # value needed
version_aliases = # value needed
}
Tests analytics
Total tests: 46
Passed tests: 46
Skipped tests: 0
Affected tests: 0
Click here to see the affected service packages
- vertexai
$\textcolor{green}{\textsf{All tests passed!}}$ View the build log
hi @BBBmau
Thank you very much for working on this. Would it be possible to include the `network` flag as well?
Note: the operation response does not include a `name` field the way other resources' responses do. This is the operation response we get once finished:
{
"name": "projects/110495173025/locations/us-central1/models/2955453170601426944/operations/7180409647818342400",
"metadata": {
"@type": "type.googleapis.com/google.cloud.aiplatform.v1.CopyModelOperationMetadata",
"genericMetadata": {
"createTime": "2024-04-17T22:55:53.532557Z",
"updateTime": "2024-04-17T22:56:04.272194Z"
}
},
"done": true,
"response": {
"@type": "type.googleapis.com/google.cloud.aiplatform.v1.CopyModelResponse",
"model": "projects/110495173025/locations/us-central1/models/2955453170601426944",
"modelVersionId": "1"
}
}
This leads to needing to flatten `model` instead of `name`. This could be done with a postCreate, but I will look into being able to change the value that's flattened within `operation: !ruby/object:Api::OpAsync::Operation`.
Line 739 in resource_vertex_ai_models.go would need to be changed to:
if err := d.Set("model", flattenVertexAIModelsModel(opRes["model"], d, config)); err != nil {
return err
}
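To illustrate why `model` has to be the flattened field, here is a small standalone sketch that parses an operation response shaped like the one above and pulls out the model name from `response.model` (the struct and function names are mine, not provider code):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// extractModel pulls the fully-qualified model name out of a Vertex AI
// long-running-operation response. The value lives under response.model,
// not under a resource-level name, which is why the resource flattens
// "model" instead of "name".
func extractModel(opJSON []byte) (string, error) {
	var op struct {
		Done     bool `json:"done"`
		Response struct {
			Model          string `json:"model"`
			ModelVersionId string `json:"modelVersionId"`
		} `json:"response"`
	}
	if err := json.Unmarshal(opJSON, &op); err != nil {
		return "", err
	}
	return op.Response.Model, nil
}

func main() {
	raw := []byte(`{
	  "done": true,
	  "response": {
	    "@type": "type.googleapis.com/google.cloud.aiplatform.v1.CopyModelResponse",
	    "model": "projects/110495173025/locations/us-central1/models/2955453170601426944",
	    "modelVersionId": "1"
	  }
	}`)
	model, err := extractModel(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(model)
}
```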
> hi @BBBmau Thank you very much for working on this. Would it be possible to include the `network` flag as well?
This will need some investigation, since adding `model_resource` support is based on the REST API docs and I don't see any mention of a `network` flag there; it can be looked into further once this PR is complete.
As of now, this PR will be the base for supporting the different types of vertex-ai-models.
You are able to provision a bare-minimum model (which is just `display_name`) with the tf config below:
resource "google_vertex_ai_models" "copy" {
project = "hc-17ccbe3c280c434a87e301c7c95"
display_name = "tf-test-model-upload"
# source_model = "projects/hc-17ccbe3c280c434a87e301c7c95/locations/us-central1/models/5460298988349554688"
region = "us-central1"
}
The logic was updated a bit so that the `:upload` method is triggered when `display_name` is filled out by the user, since it is the only required field for creating a model through TF.
The `:copy` method is triggered only when `source_model` is set by the user.
I've commented out what I had added for model inputs for now, since the idea is to add them based on which model type is being worked on. The first model type will be Tabular.
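Based on that trigger logic, a `:copy` config would look something like the sketch below (project and model IDs are placeholders, not real values):

```hcl
# Setting source_model triggers the :copy method instead of :upload.
resource "google_vertex_ai_models" "copy" {
  project      = "my-project" # placeholder project ID
  display_name = "tf-test-model-copy"
  source_model = "projects/my-project/locations/us-central1/models/1234567890"
  region       = "us-central1"
}
```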
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3013 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3013 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 780 insertions(+))
Open in Cloud Shell: Diff ( 4 files changed, 114 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(1 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
pipeline_job = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
source_model = # value needed
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 47
Passed tests: 46
Skipped tests: 0
Affected tests: 1
Click here to see the affected service packages
- vertexai
Action taken
Found 1 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelBasicExample
$\textcolor{red}{\textsf{Tests failed during RECORDING mode:}}$
TestAccVertexAIModels_vertexAiModelBasicExample
[Error message] [Debug log]
$\textcolor{red}{\textsf{Please fix these to complete your PR.}}$ View the build log or the debug log for each test
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3268 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3268 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 819 insertions(+))
Open in Cloud Shell: Diff ( 8 files changed, 224 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(2 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
deployed_models {
deployed_model_id = # value needed
endpoint = # value needed
}
encryption_spec {
kms_key_name = # value needed
}
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
parent_model = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
source_model = # value needed
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 48
Passed tests: 46
Skipped tests: 0
Affected tests: 2
Click here to see the affected service packages
- vertexai
Action taken
Found 2 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelBasicExample|TestAccVertexAIModels_vertexAiModelUploadBasicExample
$\textcolor{red}{\textsf{Tests failed during RECORDING mode:}}$
TestAccVertexAIModels_vertexAiModelBasicExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadBasicExample
[Error message] [Debug log]
$\textcolor{red}{\textsf{Please fix these to complete your PR.}}$ View the build log or the debug log for each test
Started working on tests for the resource. I was able to complete a basic config where the minimal config makes uploading a model possible; next up would be supporting a container_spec test, which would involve filling out the following:
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
The magician report that lists the missing values will become smaller as more tests are added.
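As a starting point for that container_spec test, a config might look something like the sketch below; the image URI, routes, and probe values are illustrative placeholders, not verified working values:

```hcl
resource "google_vertex_ai_models" "container" {
  display_name = "tf-test-model-container"
  region       = "us-central1"

  container_spec {
    # Hypothetical serving image; a real test would use a prebuilt
    # Vertex AI prediction container or a custom image.
    image_uri = "us-docker.pkg.dev/my-project/my-repo/my-serving-image:latest"
    args      = ["--port=8080"]
    command   = ["/usr/bin/serve"]
    env {
      name  = "MODEL_NAME"
      value = "example"
    }
    ports {
      container_port = 8080
    }
    predict_route = "/predict"
    health_route  = "/health"
    health_probe {
      exec {
        command = ["/bin/true"]
      }
      period_seconds  = 10
      timeout_seconds = 5
    }
  }
}
```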
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 3269 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 3269 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 819 insertions(+))
Open in Cloud Shell: Diff ( 8 files changed, 224 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(2 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
artifact_uri = # value needed
container_spec {
args = # value needed
command = # value needed
deployment_timeout = # value needed
env {
name = # value needed
value = # value needed
}
grpc_ports {
container_port = # value needed
}
health_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
health_route = # value needed
image_uri = # value needed
ports {
container_port = # value needed
}
predict_route = # value needed
shared_memory_size_mb = # value needed
startup_probe {
exec {
command = # value needed
}
period_seconds = # value needed
timeout_seconds = # value needed
}
}
deployed_models {
deployed_model_id = # value needed
endpoint = # value needed
}
encryption_spec {
kms_key_name = # value needed
}
metadata {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
metadata_schema_uri = # value needed
parent_model = # value needed
predict_schemata {
instance_schema_uri = # value needed
parameters_schema_uri = # value needed
prediction_schema_uri = # value needed
}
source_model = # value needed
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}
Tests analytics
Total tests: 48
Passed tests: 46
Skipped tests: 0
Affected tests: 2
Click here to see the affected service packages
- vertexai
Action taken
Found 2 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIModels_vertexAiModelBasicExample|TestAccVertexAIModels_vertexAiModelUploadBasicExample
$\textcolor{green}{\textsf{Tests passed during RECORDING mode:}}$
TestAccVertexAIModels_vertexAiModelUploadBasicExample
[Debug log]
$\textcolor{green}{\textsf{No issues found for passed tests after REPLAYING rerun.}}$
$\textcolor{red}{\textsf{Tests failed during RECORDING mode:}}$
TestAccVertexAIModels_vertexAiModelBasicExample
[Error message] [Debug log]
$\textcolor{red}{\textsf{Please fix these to complete your PR.}}$ View the build log or the debug log for each test
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 6938 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 6938 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 1972 insertions(+))
Open in Cloud Shell: Diff ( 16 files changed, 550 insertions(+))
Errors
google provider:
- The diff processor failed to build. This is usually due to the downstream provider failing to compile.
google-beta provider:
- The diff processor failed to build. This is usually due to the downstream provider failing to compile.
Tests analytics
Total tests: 0
Passed tests: 0
Skipped tests: 0
Affected tests: 0
Click here to see the affected service packages
- vertexai
Non-exercised tests
Tests were added that are skipped in VCR:
- TestAccVertexAIModels_vertexAiModelBasicExample
- TestAccVertexAIModels_vertexAiModelUploadBasicExample
- TestAccVertexAIModels_vertexAiModelUploadContainerSpecExample
- TestAccVertexAIModels_vertexAiModelUploadPredictSchemataExample
$\textcolor{red}{\textsf{Errors occurred during REPLAYING mode. Please fix them to complete your PR.}}$ View the build log
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 6666 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 6666 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 1931 insertions(+))
Open in Cloud Shell: Diff ( 16 files changed, 550 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(4 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
container_spec {
deployment_timeout = # value needed
shared_memory_size_mb = # value needed
}
explanation_spec {
metadata {
inputs {
dense_shape_tensor_name = # value needed
encoded_baselines = # value needed
encoded_tensor_name = # value needed
encoding = # value needed
feature_value_domain {
max_value = # value needed
min_value = # value needed
original_mean = # value needed
original_stddev = # value needed
}
group_name = # value needed
index_feature_mapping = # value needed
indices_tensor_name = # value needed
input_tensor_name = # value needed
modality = # value needed
name = # value needed
sha1_sum = # value needed
visualization {
clip_percent_lowerbound = # value needed
clip_percent_upperbound = # value needed
color_map = # value needed
overlay_type = # value needed
polarity = # value needed
type = # value needed
}
}
latent_space_source = # value needed
outputs {
display_name_mapping_key = # value needed
index_display_name_mapping = # value needed
name = # value needed
output_tensor_name = # value needed
}
}
parameters {
examples {
example_gcs_source {
data_format = # value needed
gcs_source {
uris = # value needed
}
}
nearest_neighbor_search_config {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
neighbor_count = # value needed
presets {
modality = # value needed
query = # value needed
}
}
integrated_gradients_attribution {
blur_baseline_config {
max_blur_sigma = # value needed
}
smooth_grad_config {
feature_noise_sigma {
noise_sigma {
name = # value needed
sigma = # value needed
}
}
noise_sigma = # value needed
noisy_sample_count = # value needed
}
step_count = # value needed
}
output_indices = # value needed
sampled_shapley_attribution {
path_count = # value needed
}
top_k = # value needed
xrai_attribution {
blur_baseline_config {
max_blur_sigma = # value needed
}
smooth_grad_config {
feature_noise_sigma {
noise_sigma {
name = # value needed
sigma = # value needed
}
}
noise_sigma = # value needed
noisy_sample_count = # value needed
}
step_count = # value needed
}
}
}
model = # value needed
model_id = # value needed
parent_model = # value needed
source_model = # value needed
version_aliases = # value needed
}
Tests analytics
Total tests: 50
Passed tests: 43
Skipped tests: 0
Affected tests: 7
Click here to see the affected service packages
- vertexai
Action taken
Found 7 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIEndpointIamBinding|TestAccVertexAIEndpointIamPolicy|TestAccVertexAIFeatureOnlineStore_vertexAiFeatureonlinestoreWithBetaFieldsOptimizedExample|TestAccVertexAIModels_vertexAiModelBasicExample|TestAccVertexAIModels_vertexAiModelUploadBasicExample|TestAccVertexAIModels_vertexAiModelUploadContainerSpecExample|TestAccVertexAIModels_vertexAiModelUploadPredictSchemataExample
$\textcolor{green}{\textsf{Tests passed during RECORDING mode:}}$
TestAccVertexAIEndpointIamPolicy
[Debug log]
$\textcolor{green}{\textsf{No issues found for passed tests after REPLAYING rerun.}}$
$\textcolor{red}{\textsf{Tests failed during RECORDING mode:}}$
TestAccVertexAIEndpointIamBinding
[Error message] [Debug log]
TestAccVertexAIFeatureOnlineStore_vertexAiFeatureonlinestoreWithBetaFieldsOptimizedExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelBasicExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadBasicExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadContainerSpecExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadPredictSchemataExample
[Error message] [Debug log]
$\textcolor{red}{\textsf{Please fix these to complete your PR.}}$ View the build log or the debug log for each test
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 5 files changed, 6659 insertions(+), 2 deletions(-))
google-beta provider: Diff ( 5 files changed, 6659 insertions(+), 2 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 1931 insertions(+))
Open in Cloud Shell: Diff ( 16 files changed, 547 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(4 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
container_spec {
deployment_timeout = # value needed
shared_memory_size_mb = # value needed
}
deployed_models {
deployed_model_id = # value needed
endpoint = # value needed
}
explanation_spec {
metadata {
inputs {
dense_shape_tensor_name = # value needed
encoded_baselines = # value needed
encoded_tensor_name = # value needed
encoding = # value needed
feature_value_domain {
max_value = # value needed
min_value = # value needed
original_mean = # value needed
original_stddev = # value needed
}
group_name = # value needed
index_feature_mapping = # value needed
indices_tensor_name = # value needed
input_tensor_name = # value needed
modality = # value needed
name = # value needed
sha1_sum = # value needed
visualization {
clip_percent_lowerbound = # value needed
clip_percent_upperbound = # value needed
color_map = # value needed
overlay_type = # value needed
polarity = # value needed
type = # value needed
}
}
latent_space_source = # value needed
outputs {
display_name_mapping_key = # value needed
index_display_name_mapping = # value needed
name = # value needed
output_tensor_name = # value needed
}
}
parameters {
examples {
example_gcs_source {
data_format = # value needed
gcs_source {
uris = # value needed
}
}
nearest_neighbor_search_config {
config {
algorithm_config {
tree_ah_config {
leaf_node_embedding_count = # value needed
leaf_nodes_to_search_percent = # value needed
}
}
approximate_neighbors_count = # value needed
dimensions = # value needed
distance_measure_type = # value needed
feature_norm_type = # value needed
shard_size = # value needed
}
contents_delta_uri = # value needed
is_complete_overwrite = # value needed
}
neighbor_count = # value needed
presets {
modality = # value needed
query = # value needed
}
}
integrated_gradients_attribution {
blur_baseline_config {
max_blur_sigma = # value needed
}
smooth_grad_config {
feature_noise_sigma {
noise_sigma {
name = # value needed
sigma = # value needed
}
}
noise_sigma = # value needed
noisy_sample_count = # value needed
}
step_count = # value needed
}
output_indices = # value needed
sampled_shapley_attribution {
path_count = # value needed
}
top_k = # value needed
xrai_attribution {
blur_baseline_config {
max_blur_sigma = # value needed
}
smooth_grad_config {
feature_noise_sigma {
noise_sigma {
name = # value needed
sigma = # value needed
}
}
noise_sigma = # value needed
noisy_sample_count = # value needed
}
step_count = # value needed
}
}
}
model = # value needed
model_id = # value needed
parent_model = # value needed
source_model = # value needed
version_aliases = # value needed
}
Tests analytics
Total tests: 50
Passed tests: 44
Skipped tests: 0
Affected tests: 6
Click here to see the affected service packages
- vertexai
Action taken
Found 6 affected test(s) by replaying old test recordings. Starting RECORDING based on the most recent commit. Click here to see the affected tests
TestAccVertexAIEndpointIamBinding|TestAccVertexAIFeatureOnlineStore_vertexAiFeatureonlinestoreWithBetaFieldsOptimizedExample|TestAccVertexAIModels_vertexAiModelBasicExample|TestAccVertexAIModels_vertexAiModelUploadBasicExample|TestAccVertexAIModels_vertexAiModelUploadContainerSpecExample|TestAccVertexAIModels_vertexAiModelUploadPredictSchemataExample
$\textcolor{green}{\textsf{Tests passed during RECORDING mode:}}$
TestAccVertexAIEndpointIamBinding
[Debug log]
TestAccVertexAIModels_vertexAiModelUploadContainerSpecExample
[Debug log]
$\textcolor{green}{\textsf{No issues found for passed tests after REPLAYING rerun.}}$
$\textcolor{red}{\textsf{Tests failed during RECORDING mode:}}$
TestAccVertexAIFeatureOnlineStore_vertexAiFeatureonlinestoreWithBetaFieldsOptimizedExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelBasicExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadBasicExample
[Error message] [Debug log]
TestAccVertexAIModels_vertexAiModelUploadPredictSchemataExample
[Error message] [Debug log]
$\textcolor{red}{\textsf{Please fix these to complete your PR.}}$ View the build log or the debug log for each test
Hi there, I'm the Modular magician. I've detected the following information about your changes:
Diff report
Your PR generated some diffs in downstreams - here they are.
google provider: Diff ( 6 files changed, 6917 insertions(+), 8 deletions(-))
google-beta provider: Diff ( 6 files changed, 6917 insertions(+), 8 deletions(-))
terraform-google-conversion: Diff ( 1 file changed, 1931 insertions(+))
Open in Cloud Shell: Diff ( 20 files changed, 760 insertions(+))
Missing test report
Your PR includes resource fields which are not covered by any test.
Resource: google_vertex_ai_models
(5 total tests)
Please add an acceptance test which includes these fields. The test should include the following:
resource "google_vertex_ai_models" "primary" {
container_spec {
deployment_timeout = # value needed
shared_memory_size_mb = # value needed
}
deployed_models {
deployed_model_id = # value needed
endpoint = # value needed
}
model = # value needed
model_id = # value needed
parent_model = # value needed
source_model = # value needed
supported_export_formats {
exportable_content = # value needed
}
version_aliases = # value needed
}