TensorFlow Hub with Keras
TensorFlow Hub is a way to share pretrained model components. See the TensorFlow Module Hub for a searchable list of pre-trained models. This tutorial demonstrates:
- How to use TensorFlow Hub with Keras.
- How to do image classification using TensorFlow Hub.
- How to do simple transfer learning.
Setup
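The setup code is not shown in this copy of the tutorial. A minimal sketch of what it presumably loads, assuming the keras and tfhub R packages are installed:

```r
# Load the packages used throughout this tutorial:
# keras provides the model-building API, tfhub provides layer_hub().
library(keras)
library(tfhub)
```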
An ImageNet classifier
Download the classifier
Use `layer_hub` to load a MobileNet and transform it into a Keras layer. Any TensorFlow 2 compatible image classifier URL from tfhub.dev will work here.
```r
classifier_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/2"
mobilenet_layer <- layer_hub(handle = classifier_url)
```
We can then create the Keras model:
```r
input <- layer_input(shape = c(224, 224, 3))
output <- input %>%
  mobilenet_layer()
model <- keras_model(input, output)
```
Run it on a single image
Download a single image to try the model on.
```r
library(magick) # image_read(), image_resize(), image_data()

img <- image_read("https://storage.googleapis.com/download.tensorflow.org/example_images/grace_hopper.jpg") %>%
  image_resize(geometry = "224x224x3!") %>%
  image_data() %>%
  as.numeric() %>%
  abind::abind(along = 0) # expand to batch dimension
```
```r
result <- predict(model, img)
mobilenet_decode_predictions(result[, -1, drop = FALSE])
#> [[1]]
#>   class_name class_description    score
#> 1  n03763968  military_uniform 9.355025
#> 2  n03787032       mortarboard 5.400680
#> 3  n02817516          bearskin 5.297816
#> 4  n04350905              suit 5.200010
#> 5  n09835506        ballplayer 4.792098
```
Simple transfer learning
Using TF Hub it is simple to retrain the top layer of the model to recognize the classes in our dataset.
Dataset
For this example you will use the TensorFlow flowers dataset:
```r
data_root <- pins::pin("https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz", "flower_photos")
data_root <- fs::path_dir(fs::path_dir(data_root[100])) # go down 2 levels
```
The simplest way to load this data into our model is using `image_data_generator`.
All of TensorFlow Hub's image modules expect float inputs in the `[0, 1]` range. Use the `image_data_generator`'s `rescale` parameter to achieve this.
```r
image_generator <- image_data_generator(rescale = 1/255, validation_split = 0.2)

training_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "training"
)
#> Found 2939 images belonging to 5 classes.

validation_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "validation"
)
#> Found 731 images belonging to 5 classes.
```
The resulting object is an iterator that returns `image_batch`, `label_batch` pairs.
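To see what the iterator yields, one way to pull a single batch is sketched below; this code is not part of the original tutorial, and the commented shapes assume the default batch size of 32 and the 5 flower classes:

```r
# generator_next() draws one batch from the iterator.
batch <- generator_next(training_data)
image_batch <- batch[[1]]  # array of shape (32, 224, 224, 3), floats in [0, 1]
label_batch <- batch[[2]]  # one-hot array of shape (32, 5)
dim(image_batch)
dim(label_batch)
```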
Download the headless model
TensorFlow Hub also distributes models without the top classification layer. These can be used to easily do transfer learning.
Any TensorFlow 2 compatible image feature vector URL from tfhub.dev will work here.
```r
feature_extractor_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/2"
feature_extractor_layer <- layer_hub(handle = feature_extractor_url)
```
Attach a classification head
Now we can create our classification model by attaching a classification head to the feature extractor layer. We define the following model:
```r
input <- layer_input(shape = c(224, 224, 3))
output <- input %>%
  feature_extractor_layer() %>%
  layer_dense(units = training_data$num_classes, activation = "softmax")
model <- keras_model(input, output)
summary(model)
#> Model: "model_1"
#> ________________________________________________________________________________
#> Layer (type)                        Output Shape                    Param #
#> ================================================================================
#> input_2 (InputLayer)                [(None, 224, 224, 3)]           0
#> ________________________________________________________________________________
#> keras_layer_1 (KerasLayer)          (None, 1280)                    2257984
#> ________________________________________________________________________________
#> dense (Dense)                       (None, 5)                       6405
#> ================================================================================
#> Total params: 2,264,389
#> Trainable params: 6,405
#> Non-trainable params: 2,257,984
#> ________________________________________________________________________________
```
Train the model
We can now train our model in the same way we would train any other Keras model. We first use `compile` to configure the training process:
```r
model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = "adam",
  metrics = "acc"
)
```
We can then use the `fit_generator` function to fit our model:
```r
model %>% fit_generator(
  training_data,
  steps_per_epoch = training_data$n / training_data$batch_size,
  validation_data = validation_data
)
#> 91/91 [==============================] - 239s 3s/step - loss: 0.7272 - acc: 0.7303 - val_loss: 0.4682 - val_acc: 0.8372
```
You can then export your model with:
```r
save_model_tf(model, "model")
```
You can also reload the model with the `load_model_tf` function. Note that you need to pass the hub layer via `custom_objects`, because it is not a default Keras layer.
```r
reloaded_model <- load_model_tf("model")
```
We can verify that the predictions of the trained model and the reloaded model are equal:
```r
steps <- as.integer(validation_data$n / validation_data$batch_size)
all.equal(
  predict_generator(model, validation_data, steps = steps),
  predict_generator(reloaded_model, validation_data, steps = steps)
)
#> [1] TRUE
```