Integrating TF Serving with TF Adapter
The following uses the installation user HwHiAiUser as an example to describe how to integrate TF Adapter into TF Serving. Replace it with the actual username. Ensure that the installation user has read permission, or read and write permissions, on the paths described in this document.
- Download the TF Serving source code.
The TF Serving version must match the TensorFlow version. Upload the source package to any directory on the server.
- Go to the directory where the source package is stored, then run the following commands to decompress and enter the TF Serving source package:

```
unzip 1.15.0.zip
cd serving-1.15.0/
```
- Add third-party dependency packages of TF Serving.
- Create the tf_adapter folder in the serving-1.15.0/third_party directory, and go to the folder.
```
cd third_party/
mkdir tf_adapter
cd tf_adapter
```
- Copy the libpython3.7m.so.1.0 file to the tf_adapter folder, and create a soft link.
```
cp /usr/local/python3.7.5/lib/libpython3.7m.so.1.0 .
ln -s libpython3.7m.so.1.0 libpython3.7m.so
```
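The copy-and-link step above can be sanity-checked. The snippet below is a minimal sketch that uses a throwaway directory and an empty stand-in file instead of the real library, to show that the soft link resolves back to the versioned file name:

```shell
# Work in a throwaway directory; the empty file stands in for the real
# libpython3.7m.so.1.0 copied in the step above.
cd "$(mktemp -d)"
touch libpython3.7m.so.1.0
ln -s libpython3.7m.so.1.0 libpython3.7m.so

# The link should resolve to the versioned file name.
readlink libpython3.7m.so   # prints: libpython3.7m.so.1.0
```

Running the same `readlink` in the real tf_adapter folder confirms the link before moving on.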
- Copy the _tf_adapter.so file to the tf_adapter folder, and change the file name from _tf_adapter.so to lib_tf_adapter.so.
```
cp ${TFPLUGIN_INSTALL_PATH}/npu_bridge/_tf_adapter.so .
mv _tf_adapter.so lib_tf_adapter.so
```
${TFPLUGIN_INSTALL_PATH} is the installation path of the TF Adapter package.
- Compile empty libtensorflow_framework.so and _pywrap_tensorflow_internal.so files.
- Run the following command in the tf_adapter folder:

```
vim CMakeLists.txt
```

- Add the following content:

```
file(TOUCH ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
add_library(_pywrap_tensorflow_internal SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
add_library(tensorflow_framework SHARED ${CMAKE_CURRENT_BINARY_DIR}/stub.c)
```

- Run the :wq! command to save the file and exit.
- Compile empty .so files.
```
mkdir temp
cd temp
cmake ..
make
mv lib_pywrap_tensorflow_internal.so ../_pywrap_tensorflow_internal.so
mv libtensorflow_framework.so ../libtensorflow_framework.so
cd ..
ln -s libtensorflow_framework.so libtensorflow_framework.so.1
```
- Configure the environment variable.
```
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)
```
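To confirm the variable was set correctly in the current shell session, the tf_adapter folder should appear as one of the colon-separated entries:

```shell
# Append the current directory (the tf_adapter folder) and verify that it
# now appears as one of the colon-separated LD_LIBRARY_PATH entries.
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | grep -Fx "$(pwd)"
```

Note that the export only applies to the current shell; add it to the user's profile if the compilation is run in a new session.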
- Run the following commands to create a BUILD file in the tf_adapter folder:
```
vim BUILD
```
- Add the following content:
```
licenses(["notice"])  # BSD/MIT

cc_import(
    name = "tf_adapter",
    shared_library = "lib_tf_adapter.so",
    visibility = ["//visibility:public"],
)

cc_import(
    name = "tf_python",
    shared_library = "libpython3.7m.so",
    visibility = ["//visibility:public"],
)
```

- Run the :wq! command to save the file and exit.
- Add the two //third_party/tf_adapter dependency lines shown below to cc_binary in the BUILD file under the serving-1.15.0/tensorflow_serving/model_servers/ directory:

```
cc_binary(
    name = "tensorflow_model_server",
    stamp = 1,
    visibility = [
        ":testing",
        "//tensorflow_serving:internal",
    ],
    deps = [
        ":tensorflow_model_server_main_lib",
        "//third_party/tf_adapter:tf_adapter",
        "//third_party/tf_adapter:tf_python",
        "@org_tensorflow//tensorflow/compiler/jit:xla_cpu_jit",
    ],
)
```

- Perform TF Serving compilation.
Run the following command in the TF Serving installation directory serving-1.15.0 to compile TF Serving:
```
bazel --output_user_root=/opt/tf_serving build -c opt --cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" tensorflow_serving/model_servers:tensorflow_model_server
```
The --output_user_root option specifies the installation path of TF Serving. Set it based on the actual situation.
- If a dependency package fails to download, download it manually by referring to Manually Downloading Dependency Packages for TF Serving Compilation.
- If the builtins dependency modules cannot be found, rectify the fault by referring to What Should I Do If an Error About builtins Missing Is Displayed During TF Serving Compilation?.
- Create a soft link.
The command used for creating a soft link is as follows:
```
ln -s /opt/tf_serving/{tf_serving_ID}/execroot/tf_serving/bazel-out/xxx-opt/bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/tensorflow_model_server
```
- {tf_serving_ID} is a hash-like string of characters, for example, 063944eceea3e72745362a0b6eb12a3c. Set it based on the actual situation.
- The xxx-opt folder is automatically generated by the build tool. Replace it with the actual folder name.
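The effect of this final soft link can be sketched with throwaway paths: a command invoked through the link behaves exactly like the target binary. The real_server script below is a hypothetical stand-in for the compiled tensorflow_model_server:

```shell
# Throwaway demo: a tiny script stands in for the real server binary.
bindir="$(mktemp -d)"
printf '#!/bin/sh\necho ok\n' > "$bindir/real_server"
chmod +x "$bindir/real_server"

# Link it under the public name, as done with /usr/local/bin above.
ln -s "$bindir/real_server" "$bindir/tensorflow_model_server"

# Invoking the link runs the target.
"$bindir/tensorflow_model_server"   # prints: ok
```

After creating the real link, `ls -l /usr/local/bin/tensorflow_model_server` should show it resolving into the bazel-out output tree.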