adamnarozniak committed (verified) · commit 3dfdf8f · 1 parent: 576e4b9

Update README.md

Files changed (1): README.md (+61 −2)
README.md CHANGED
@@ -41,11 +41,11 @@ size_categories:
 ---
 # Dataset Card for Fed-ISIC-2019
 
-Federated version of ISIC-2019 Datasets ([ISIC2019 challenge](https://challenge.isic-archive.com/landing/2019/) and the [HAM1000 database](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/DBW86T)).The dataset contains 6 clients whose data comes from different data centers.
+Federated version of ISIC-2019 Datasets ([ISIC2019 challenge](https://challenge.isic-archive.com/landing/2019/) and the [HAM1000 database](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/DBW86T)). This implementation is derived from the [FLamby](https://github.com/owkin/FLamby/blob/main/flamby/datasets/fed_isic2019/README.md) implementation.
 
 ## Dataset Details
 
-The dataset contains 23,247 images of skin lesions. The number of samples for train/test per each data center is displayed in the table below.
+The dataset contains 23,247 images of skin lesions divided among 6 clients representing different data centers. The number of samples for training/testing per data center is displayed in the table below:
 
 | center_id | Train | Test |
 |:---------:|:-------:|:------:|
@@ -87,6 +87,65 @@ partition_train = fds.load_partition(partition_id=0, split="train")
 partition_test = fds.load_partition(partition_id=0, split="test")
 ```
 
+```
+# Note: to reproduce the same results as in FLamby, apply the following transformations
+import albumentations
+import random
+import numpy as np
+import torch
+
+
+# Train dataset transformations
+def apply_train_transforms(image_input):
+    size = 200
+    train_transforms = albumentations.Compose(
+        [
+            albumentations.RandomScale(0.07),
+            albumentations.Rotate(50),
+            albumentations.RandomBrightnessContrast(0.15, 0.1),
+            albumentations.Flip(p=0.5),
+            albumentations.Affine(shear=0.1),
+            albumentations.RandomCrop(size, size),
+            albumentations.CoarseDropout(random.randint(1, 8), 16, 16),
+            albumentations.Normalize(always_apply=True),
+        ]
+    )
+    images = []
+    for image in image_input["image"]:
+        augmented = train_transforms(image=np.array(image))["image"]
+        transposed = np.transpose(augmented, (2, 0, 1)).astype(np.float32)
+        images.append(torch.tensor(transposed, dtype=torch.float32))
+    image_input["image"] = images
+    return image_input
+
+
+partition_train = partition_train.with_transform(apply_train_transforms,
+                                                 columns="image")
+
+
+# Test dataset transformations
+def apply_test_transforms(image_input):
+    size = 200
+    test_transforms = albumentations.Compose(
+        [
+            albumentations.CenterCrop(size, size),
+            albumentations.Normalize(always_apply=True),
+        ]
+    )
+    images = []
+    for image in image_input["image"]:
+        augmented = test_transforms(image=np.array(image))["image"]
+        transposed = np.transpose(augmented, (2, 0, 1)).astype(np.float32)
+        images.append(torch.tensor(transposed, dtype=torch.float32))
+    image_input["image"] = images
+    return image_input
+
+
+partition_test = partition_test.with_transform(apply_test_transforms,
+                                               columns="image")
+```
+
 ## Dataset Structure
 
 ### Data Instances
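The `np.transpose(augmented, (2, 0, 1))` step in the transforms added by this commit converts each image from the HWC (height, width, channels) layout that albumentations returns to the channel-first CHW layout that PyTorch models expect; a minimal NumPy-only sketch (the 200×200×3 shape mirrors the `RandomCrop(200, 200)` output, the dummy array is illustrative):

```python
import numpy as np

# Dummy 200x200 RGB image in HWC layout, as albumentations would return it
image_hwc = np.zeros((200, 200, 3), dtype=np.float32)

# Reorder axes to channel-first (CHW), as expected by PyTorch conv layers
image_chw = np.transpose(image_hwc, (2, 0, 1))
print(image_chw.shape)  # (3, 200, 200)
```

`np.transpose` returns a view with reordered axes, so this step is cheap; the subsequent `torch.tensor(...)` call in the transforms is what copies the data.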
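The `albumentations.Normalize(always_apply=True)` call in the added transforms takes no explicit statistics, so it falls back to the library defaults, which to my understanding are the ImageNet per-channel mean/std applied after scaling by `max_pixel_value=255` (worth verifying against the installed albumentations version); an equivalent NumPy sketch under that assumption:

```python
import numpy as np

# ImageNet per-channel statistics -- assumed albumentations.Normalize defaults
MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def normalize(img_uint8):
    # Equivalent of (img - mean * 255) / (std * 255): scale to [0, 1],
    # subtract the per-channel mean, divide by the per-channel std
    img = img_uint8.astype(np.float32) / 255.0
    return (img - MEAN) / STD

# Mid-gray pixel: red channel becomes (128/255 - 0.485) / 0.229, roughly 0.074
out = normalize(np.full((2, 2, 3), 128, dtype=np.uint8))
print(out[0, 0, 0])
```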