
The TensorFlow Cloud repository provides APIs that make it easy to go from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.

78 cloud issues, sorted by recently updated

From @yixingfu: A path ending with / causes a problem. For example, setting the path to ‘gs://[BUCKET_NAME]/saves’ works fine, but ‘gs://[BUCKET_NAME]/saves/’ breaks when trying to reload remotely. For example,...

bug
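Until the path handling is fixed, a simple client-side workaround is to strip the trailing slash before handing the path to TensorFlow Cloud. A minimal sketch; the helper name and bucket name are hypothetical, not part of the library:

```python
def normalize_gcs_path(path: str) -> str:
    """Strip any trailing slashes so the remote reload sees the same path that was saved."""
    return path.rstrip("/")

# Both spellings resolve to the same save path ("my-bucket" is a placeholder).
print(normalize_gcs_path("gs://my-bucket/saves/"))  # gs://my-bucket/saves
print(normalize_gcs_path("gs://my-bucket/saves"))   # gs://my-bucket/saves
```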

Currently, to re-run tests you need to submit an empty commit:

```console
git commit -m "retest" --allow-empty
git push
```

This is a request for a nice-to...

enhancement

Add an integration test example with python script on multiple files.

enhancement

Feedback from Yixing Fu: Maybe it would be better for the tutorial to suggest the bucket approach first, rather than local build, since cloud build is more consistent?

bug

My current workflow is to use the gcloud CLI to submit jobs with a common base image while varying the parameters (used in a CI setting). I like that in...

enhancement

I started a job with log streaming enabled. While the job was running, I manually terminated it in AI Platform Training by clicking stop job in the UI....

bug

Either tfc.run() should return the job_id, or there should be a method to retrieve the job ID. Also, a follow-up ask (nice to have) here would be to...

enhancement

sample

```python
import tensorflow_cloud as tfc

# Automated MirroredStrategy: chief config with multiple GPUs
tfc.run(
    entry_point="../../tests/testdata/mnist_example_using_fit_no_reqs.py",
    distribution_strategy="auto",
    chief_config=tfc.MachineConfig(
        cpu_cores=8,
        memory=30,
        accelerator_type=tfc.AcceleratorType.NVIDIA_TESLA_P100,
        accelerator_count=2,
    ),
    worker_count=0,
    stream_logs=False,
    docker_image_bucket_name="",
)
```

output (truncated in the listing)

```shell
...
```

bug

This is particularly useful for automation and rapid iteration. It allows for easy tracking of the execution result, without needing to keep track of generated IDs and map...

enhancement

After some experimenting, using keras-tuner with chief-worker distribution can (only) be done by setting `KERASTUNER_ORACLE_IP` to 0.0.0.0 on the chief, but to the actual chief IP obtained from `TF_CONFIG` on the worker replicas....

enhancement