If you are opening a PR for Official Notebooks under the notebooks/official folder, follow this mandatory checklist:
- [ ] Use the notebook template as a starting point.
- [ ] Follow the style and grammar rules outlined in the notebook template above.
- [ ] Verify that the notebook runs successfully in Colab, since the automated tests cannot guarantee this even when they pass.
- [ ] The notebook passes all the required automated checks. You can test formatting and linting locally with these instructions.
- [ ] You have consulted a tech writer to determine whether a tech writer review is necessary. If so, the notebook has been reviewed and approved by a tech writer.
- [ ] This notebook has been added to the CODEOWNERS file under the Official Notebooks section, pointing to the author or the author's team.
- [ ] The Jupyter notebook cleans up any artifacts it has created (datasets, ML models, endpoints, etc.) so that it does not consume unnecessary resources (see the sketch below).
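A minimal cleanup-cell sketch, assuming the notebook created an endpoint, a model, and a tabular dataset with the google-cloud-aiplatform SDK; the resource names below are hypothetical placeholders, so substitute whatever your notebook actually created:

```python
from google.cloud import aiplatform

# Hypothetical resource names; replace with the resources your notebook created.
endpoint = aiplatform.Endpoint("projects/PROJECT/locations/REGION/endpoints/ENDPOINT_ID")
model = aiplatform.Model("projects/PROJECT/locations/REGION/models/MODEL_ID")
dataset = aiplatform.TabularDataset("projects/PROJECT/locations/REGION/datasets/DATASET_ID")

endpoint.undeploy_all()  # an endpoint with deployed models cannot be deleted
endpoint.delete()
model.delete()
dataset.delete()
```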
If you are opening a PR for Community Notebooks under the notebooks/community folder:
- [ ] This notebook has been added to the CODEOWNERS file under the Community Notebooks section, pointing to the author or the author's team.
- [ ] The notebook passes all the required formatting and linting checks. You can test locally with these instructions.
If you are opening a PR for Community Content under the community-content folder:
- [ ] Make sure your main Content Directory Name is descriptive and informative, and includes some of the key products and attributes of your content, so that it is differentiable from other content.
- [ ] The main content directory has been added to the CODEOWNERS file under the Community Content section, pointing to the author or the author's team.
- [ ] The content passes all the required formatting and linting checks. You can test locally with these instructions.
Step #6: 2 for i in range(0, len(runs)):
Step #6: 3 pipeline_job = vertex_ai.PipelineJob.get(pipeline_experiments_df.run_name[i])
Step #6: ----> 4 pipeline_job.delete()
Step #6:
Step #6: File /workspace/workspace/env/lib/python3.9/site-packages/google/cloud/aiplatform/base.py:788, in optional_sync.<locals>.optional_run_in_thread.<locals>.wrapper(*args, **kwargs)
Step #6: 786 if self:
Step #6: 787 VertexAiResourceNounWithFutureManager.wait(self)
Step #6: --> 788 return method(*args, **kwargs)
Step #6: 790 # callbacks to call within the Future (in same Thread)
Step #6: 791 internal_callbacks = []
Step #6:
Step #6: File /workspace/workspace/env/lib/python3.9/site-packages/google/cloud/aiplatform/base.py:1206, in VertexAiResourceNounWithFutureManager.delete(self, sync)
Step #6: 1196 """Deletes this Vertex AI resource. WARNING: This deletion is
Step #6: 1197 permanent.
Step #6: 1198
Step #6: (...)
Step #6: 1203 be immediately returned and synced when the Future has completed.
Step #6: 1204 """
Step #6: 1205 _LOGGER.log_action_start_against_resource("Deleting", "", self)
Step #6: -> 1206 lro = getattr(self.api_client, self._delete_method)(name=self.resource_name)
Step #6: 1207 _LOGGER.log_action_started_against_resource_with_lro(
Step #6: 1208 "Delete", "", self.__class__, lro
Step #6: 1209 )
Step #6: 1210 lro.result()
Step #6:
Step #6: File /workspace/workspace/env/lib/python3.9/site-packages/google/cloud/aiplatform_v1/services/pipeline_service/client.py:1599, in PipelineServiceClient.delete_pipeline_job(self, request, name, retry, timeout, metadata)
Step #6: 1594 metadata = tuple(metadata) + (
Step #6: 1595 gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)),
Step #6: 1596 )
Step #6: 1598 # Send the request.
Step #6: -> 1599 response = rpc(
Step #6: 1600 request,
Step #6: 1601 retry=retry,
Step #6: 1602 timeout=timeout,
Step #6: 1603 metadata=metadata,
Step #6: 1604 )
Step #6: 1606 # Wrap the response in an operation future.
Step #6: 1607 response = gac_operation.from_gapic(
Step #6: 1608 response,
Step #6: 1609 self._transport.operations_client,
Step #6: 1610 empty_pb2.Empty,
Step #6: 1611 metadata_type=gca_operation.DeleteOperationMetadata,
Step #6: 1612 )
Step #6:
Step #6: File /workspace/workspace/env/lib/python3.9/site-packages/google/api_core/gapic_v1/method.py:154, in _GapicCallable.__call__(self, timeout, retry, *args, **kwargs)
Step #6: 151 metadata.extend(self._metadata)
Step #6: 152 kwargs["metadata"] = metadata
Step #6: --> 154 return wrapped_func(*args, **kwargs)
Step #6:
Step #6: File /workspace/workspace/env/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:52, in wrap_unary_errors.<locals>.error_remapped_callable(*args, **kwargs)
Step #6: 50 return callable(*args, **kwargs)
Step #6: 51 except grpc.RpcError as exc:
Step #6: ---> 52 raise exceptions.from_grpc_error(exc) from exc
Step #6:
Step #6: FailedPrecondition: 400 The PipelineJob "projects/1012616486416/locations/us-central1/pipelineJobs/custom-training-pipeline-20220701010940" is in state "RUNNING", and cannot be deleted. Please cancel it or wait for its completion before trying to delete it again.
Step #6:
Finished Step #6
ERROR
ERROR: build step 6 "gcr.io/cloud-devrel-public-resources/python-samples-testing-docker:latest" failed: step exited with non-zero status: 1
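The failing cell is the cleanup loop at the top of the traceback: it calls delete() on every pipeline job recorded in pipeline_experiments_df without checking its state, and a PipelineJob that is still RUNNING cannot be deleted. A minimal sketch of a safer teardown, keeping the vertex_ai alias from the traceback; the delete_pipeline_job_safely helper, the terminal-state set, and the polling interval are assumptions for illustration, not part of the failing notebook:

```python
import time

from google.cloud import aiplatform as vertex_ai
from google.cloud.aiplatform_v1.types import PipelineState

# Terminal states in which a PipelineJob can be deleted.
TERMINAL_STATES = {
    PipelineState.PIPELINE_STATE_SUCCEEDED,
    PipelineState.PIPELINE_STATE_FAILED,
    PipelineState.PIPELINE_STATE_CANCELLED,
}


def delete_pipeline_job_safely(run_name: str, poll_seconds: int = 60) -> None:
    """Cancel a non-terminal PipelineJob, wait for it to settle, then delete it."""
    pipeline_job = vertex_ai.PipelineJob.get(run_name)
    if pipeline_job.state not in TERMINAL_STATES:
        pipeline_job.cancel()  # cancellation is asynchronous
    while pipeline_job.state not in TERMINAL_STATES:
        time.sleep(poll_seconds)  # .state re-fetches the resource on each access
    pipeline_job.delete()


for run_name in pipeline_experiments_df.run_name:
    delete_pipeline_job_safely(run_name)
```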