
AttributeError: 'Fleet' object has no attribute 'DistributedStrategy'

Open · llzhaoshuo opened this issue 4 years ago · 1 comment

When I run the sample code for debugging, it fails with:

Traceback (most recent call last):
  File "test.py", line 17, in <module>
    dist_strategy = fleet.DistributedStrategy()
AttributeError: 'Fleet' object has no attribute 'DistributedStrategy'

Version info: paddlepaddle-gpu==2.0rc1, fleet-x==0.0.7

What could be causing this?

llzhaoshuo · Dec 30 '20

I tried it and it works fine. Please check your environment configuration and make sure the paddle 2.0.0rc1 installation path is correct.
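
For what it's worth, one common cause of this specific AttributeError (an assumption on my part, not confirmed in this thread) is importing the legacy 1.x-style fleet object instead of the 2.0 module, since only the latter exposes DistributedStrategy. A minimal sketch of the contrast:

# Legacy 1.x-style import: this fleet object has no DistributedStrategy
# attribute, so fleet.DistributedStrategy() raises the error reported above.
# from paddle.fluid.incubate.fleet.collective import fleet

# 2.0-style import: DistributedStrategy is defined on this module.
import paddle
import paddle.distributed.fleet as fleet

# Verify which installation Python actually loads; a stale or shadowed
# copy on sys.path is another frequent culprit.
print(paddle.__version__)  # expected: '2.0.0-rc1'
print(paddle.__file__)     # should point at the intended site-packages

dist_strategy = fleet.DistributedStrategy()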

>>> import paddle
>>> paddle.__version__
'2.0.0-rc1'
>>> import paddle.distributed.fleet as fleet
>>> dist_strategy = fleet.DistributedStrategy()
>>> print(dist_strategy)

+==============================================================================+
|                                                                              |
|                         DistributedStrategy Overview                         |
|                                                                              |
+==============================================================================+
|                        a_sync=True <-> a_sync_configs                        |
+------------------------------------------------------------------------------+
|                               k_steps                    -1                  |
|                     max_merge_var_num                    1                   |
|                       send_queue_size                    16                  |
|               independent_recv_thread                  False                 |
|         min_send_grad_num_before_recv                    1                   |
|                      thread_pool_size                    1                   |
|                       send_wait_times                    1                   |
|               runtime_split_send_recv                  False                 |
|                        launch_barrier                   True                 |
|             heter_worker_device_guard                   cpu                  |
+==============================================================================+
|                    Environment Flags, Communication Flags                    |
+------------------------------------------------------------------------------+
|                                  mode                    1                   |
|                               elastic                  False                 |
|                                  auto                  False                 |
|                   sync_nccl_allreduce                   True                 |
|                         nccl_comm_num                    1                   |
|            use_hierarchical_allreduce                  False                 |
|   hierarchical_allreduce_inter_nranks                    1                   |
|                       sync_batch_norm                  False                 |
|                   fuse_all_reduce_ops                   True                 |
|                  fuse_grad_size_in_MB                    32                  |
|              fuse_grad_size_in_TFLOPS                   50.0                 |
|               cudnn_exhaustive_search                   True                 |
|             conv_workspace_size_limit                   4000                 |
|    cudnn_batchnorm_spatial_persistent                   True                 |
|                        fp16_allreduce                  False                 |
|               last_comm_group_size_MB                   1.0                  |
+==============================================================================+
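
Once DistributedStrategy constructs correctly, typical collective training passes it to fleet.init. A minimal usage sketch, assuming the 2.0-style collective API (the amp toggle below is just an illustrative knob, not something this issue requires):

import paddle
import paddle.distributed.fleet as fleet

dist_strategy = fleet.DistributedStrategy()
dist_strategy.amp = True  # example: enable automatic mixed precision

# Initialize fleet in collective mode with the configured strategy;
# the script would then be launched via `python -m paddle.distributed.launch`.
fleet.init(is_collective=True, strategy=dist_strategy)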

gentelyang · Jan 11 '21