thebeancounter
This code worked for me:

```python
def sync_s3_to_local(bucket, local):
    import boto3
    import os
    os.makedirs(local, exist_ok=True)  # bare mkdir(local) is undefined; os.makedirs is the likely intent
    files = set(os.listdir(local))
    s3 = boto3.resource('s3')
    bucket = s3.Bucket(bucket)
    for obj in bucket.objects.all():
        file = obj.key
        ...
```
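The truncated loop above presumably skips objects that already exist locally; a minimal, testable sketch of that key-diffing step (the function name is my own, not from the snippet):

```python
def keys_to_download(remote_keys, local_files):
    """Return the S3 keys not yet present locally, sorted for stable order.

    Mirrors the set-difference step implied by the sync loop above;
    the name `keys_to_download` is hypothetical, not from the original code.
    """
    return sorted(set(remote_keys) - set(local_files))
```

Usage: `keys_to_download(['a.txt', 'b.txt'], ['a.txt'])` returns `['b.txt']`, so only the missing object would be fetched.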
Having a similar issue with a dask array @TomAugspurger, see my [SO question](https://stackoverflow.com/questions/56583112/hot-to-avoid-python-dask-logistic-regression-multiple-constant-columns-detected). Any idea?
@TomAugspurger Hi. The code is in the SO question; do you mean I should copy it here?
@TomAugspurger The data is defined: it's regular CIFAR-10 data, passed through a pre-trained ResNet-50 for feature extraction. It trains well with scikit-learn. I can't guarantee that there are no zero...
@TomAugspurger Hi, I posted the code and the data. It's a solid example :-) Anyhow, can you maybe post a working example of using a NumPy array for logistic regression in...
@TomAugspurger My data originally comes from a NumPy array; I need to convert it to some form that dask can train on. I can't find any example of that in the...
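For converting an in-memory NumPy array into something dask can consume, `dask.array.from_array` is the usual entry point; a minimal sketch (the shapes and chunk sizes below are illustrative, not the actual CIFAR-10/ResNet-50 features):

```python
import numpy as np
import dask.array as da

# Stand-in for the ResNet-50 feature matrix (shapes are illustrative)
X = np.random.RandomState(0).rand(1000, 16)
y = (X[:, 0] > 0.5).astype(int)

# Wrap the in-memory NumPy arrays as chunked dask arrays
dX = da.from_array(X, chunks=(250, 16))
dy = da.from_array(y, chunks=(250,))

print(dX.chunks)  # ((250, 250, 250, 250), (16,))

# A dask-ml estimator, e.g.
#   from dask_ml.linear_model import LogisticRegression
#   LogisticRegression().fit(dX, dy)
# should then accept these arrays (not exercised here).
```

Keeping whole rows inside each chunk (chunking only along axis 0) is the usual choice for estimators that iterate over samples.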
@TomAugspurger Scikit-learn will not utilize the machine's cores and takes way too long to run... I'm looking for a multithreaded solution.
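For completeness: scikit-learn's `LogisticRegression` does expose an `n_jobs` parameter (used to parallelize one-vs-rest fits across classes), and the `saga` solver tends to scale better on large dense data. A hedged sketch on synthetic data (the dataset below is made up, not the CIFAR features from this thread):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data; not the CIFAR-10/ResNet-50 features discussed above
rng = np.random.RandomState(0)
X = rng.rand(500, 8)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# n_jobs=-1 parallelizes over classes in one-vs-rest fits;
# for a binary problem most time is spent in the solver itself
clf = LogisticRegression(solver="saga", n_jobs=-1, max_iter=500)
clf.fit(X, y)
print(clf.score(X, y))
```

For a binary problem, `n_jobs` alone won't help much; the multicore gains come from the solver or from a distributed estimator such as dask-ml's.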
@xiaozhongtian Can you please clarify? Are you asking a question? I'm not sure I see the connection to this thread.
Hi, I am hitting this too when trying to use the pairwise distance norm of a batch of vectors; here is the [SO](https://stackoverflow.com/questions/54346263/tensorflow-gradient-getting-all-nan-values) question. Am I getting this issue because...