TensorFlow Hub with Keras

    TensorFlow Hub is a way to share pretrained model components. See the TensorFlow Module Hub for a searchable listing of pre-trained models. This tutorial demonstrates:

    1. How to use TensorFlow Hub with Keras.
    2. How to do image classification using TensorFlow Hub.
    3. How to do simple transfer learning.

    Setup

    library(keras)
    library(tfhub)
    library(magick)
    #> Linking to ImageMagick
    #> Enabled features: cairo, fontconfig, freetype, lcms, pango, rsvg, webp
    #> Disabled features: fftw, ghostscript, x11

    An ImageNet classifier

    Download the classifier

    Use layer_hub to load a MobileNet model and transform it into a Keras layer. Any TensorFlow 2 compatible image classifier URL from tfhub.dev will work here.

    classifier_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/2"
    mobilenet_layer <- layer_hub(handle = classifier_url)
    #> Done!

    We can then create our Keras model:

    input <- layer_input(shape = c(224, 224, 3))
    output <- input %>% mobilenet_layer()
    model <- keras_model(input, output)

    Run it on a single image

    Download a single image to try the model on.

    img <- image_read("https://storage.googleapis.com/download.tensorflow.org/example_images/grace_hopper.jpg") %>%
      image_resize(geometry = "224x224x3!") %>%
      image_data() %>%
      as.numeric() %>%
      abind::abind(along = 0) # expand to batch dimension

    result <- predict(model, img)
    mobilenet_decode_predictions(result[, -1, drop = FALSE])
    #> [[1]]
    #>   class_name class_description    score
    #> 1  n03763968  military_uniform 9.355025
    #> 2  n03787032       mortarboard 5.400680
    #> 3  n02817516          bearskin 5.297816
    #> 4  n04350905              suit 5.200010
    #> 5  n09835506        ballplayer 4.792098

    Simple transfer learning

    Using TF Hub it is simple to retrain the top layer of a model to recognize the classes in your dataset.

    Dataset

    For this example you will use the TensorFlow flowers dataset:

    data_root <- pins::pin("https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz", "flower_photos")
    data_root <- fs::path_dir(fs::path_dir(data_root[100])) # go down 2 levels

    The simplest way to load this data into our model is using image_data_generator.

    All of TensorFlow Hub's image modules expect float inputs in the [0, 1] range. Use the rescale parameter of image_data_generator to achieve this.

    image_generator <- image_data_generator(rescale = 1/255, validation_split = 0.2)

    training_data <- flow_images_from_directory(
      directory = data_root,
      generator = image_generator,
      target_size = c(224, 224),
      subset = "training"
    )
    #> Found 2939 images belonging to 5 classes.

    validation_data <- flow_images_from_directory(
      directory = data_root,
      generator = image_generator,
      target_size = c(224, 224),
      subset = "validation"
    )
    #> Found 731 images belonging to 5 classes.

    The resulting object is an iterator that returns image_batch, label_batch pairs.
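    For instance, you can draw a single batch with generator_next() to inspect its shape (a quick sketch, assuming the training_data generator defined above and the default batch size of 32):

    ```r
    # Draw one batch from the iterator
    batch <- generator_next(training_data)
    dim(batch[[1]]) # image batch: (32, 224, 224, 3), floats in [0, 1]
    dim(batch[[2]]) # label batch: (32, 5), one-hot encoded class labels
    ```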

    Download the headless model

    TensorFlow Hub also distributes models without the top classification layer. These can be used to easily do transfer learning.

    Any TensorFlow 2 compatible image feature vector URL from tfhub.dev will work here.

    feature_extractor_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/2"
    feature_extractor_layer <- layer_hub(handle = feature_extractor_url)

    Attach a classification head

    We can now create our classification model by attaching a classification head to the feature extractor layer. We define the following model:

    input <- layer_input(shape = c(224, 224, 3))
    output <- input %>%
      feature_extractor_layer() %>%
      layer_dense(units = training_data$num_classes, activation = "softmax")
    model <- keras_model(input, output)
    summary(model)
    #> Model: "model_1"
    #> ________________________________________________________________________________
    #> Layer (type)                        Output Shape                    Param #
    #> ================================================================================
    #> input_2 (InputLayer)                [(None, 224, 224, 3)]           0
    #> ________________________________________________________________________________
    #> keras_layer_1 (KerasLayer)          (None, 1280)                    2257984
    #> ________________________________________________________________________________
    #> dense (Dense)                       (None, 5)                       6405
    #> ================================================================================
    #> Total params: 2,264,389
    #> Trainable params: 6,405
    #> Non-trainable params: 2,257,984
    #> ________________________________________________________________________________

    Train the model

    We can now train our model the same way we would train any other Keras model. We first use compile to configure the training process:

    model %>% compile(
      loss = "categorical_crossentropy",
      optimizer = "adam",
      metrics = "acc"
    )

    We can then use the fit_generator function to fit our model.

    model %>% fit_generator(
      training_data,
      steps_per_epoch = training_data$n / training_data$batch_size,
      validation_data = validation_data
    )
    #> 91/91 [==============================] - 239s 3s/step - loss: 0.7272 - acc: 0.7303 - val_loss: 0.4682 - val_acc: 0.8372

    You can then export your model with:

    save_model_tf(model, "model")

    You can also reload the model with the load_model_tf function. Note that you may need to pass custom_objects, since the hub module is not a default Keras layer.

    reloaded_model <- load_model_tf("model")

    We can verify that the predictions of the trained model and the reloaded model are equal:

    steps <- as.integer(validation_data$n / validation_data$batch_size)
    all.equal(
      predict_generator(model, validation_data, steps = steps),
      predict_generator(reloaded_model, validation_data, steps = steps)
    )
    #> [1] TRUE

    The saved model can also be loaded for inference later, or converted to TFLite or TFjs formats.
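    As an illustration, a SavedModel directory can be converted to TensorFlow Lite through the tf$lite API exposed by the tensorflow R package (a minimal sketch, not part of the original tutorial; it assumes the "model" directory written by save_model_tf() above and uses TensorFlow 2's tf.lite.TFLiteConverter):

    ```r
    library(tensorflow)

    # Convert the SavedModel directory to a TensorFlow Lite flatbuffer
    converter <- tf$lite$TFLiteConverter$from_saved_model("model")
    tflite_model <- converter$convert()

    # Write the raw bytes to disk for deployment on mobile or embedded devices
    writeBin(tflite_model, "model.tflite")
    ```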