Commit 3ac504b

Add What-If Tool developers guide doc (#2630)

1 parent ec276ce commit 3ac504b

2 files changed: +119 −0

Lines changed: 115 additions & 0 deletions

# What-If Tool Development Guide

## First-time Setup

1. Install [Bazel](https://docs.bazel.build/versions/master/install.html)
   (for building OSS code) and [Docker](https://docs.docker.com/install/)
   (for hosting TF models using [TensorFlow Serving](https://github.com/tensorflow/serving)
   when testing WIT in TensorBoard).
2. Install pip and virtualenv.<br>
   `sudo apt-get install python-pip python3-pip virtualenv`
3. Create a virtualenv for OSS TensorBoard development.<br>
   `virtualenv ~/tf` (or wherever you want to save this environment)
4. Create a fork of the official TensorBoard repo through the GitHub UI.
5. Clone your fork to your computer.<br>
   `cd ~/github && git clone https://github.com/[yourGitHubUsername]/tensorboard.git`
6. Install TensorFlow Serving through Docker.<br>
   `docker pull tensorflow/serving`
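
A quick sanity check that the setup succeeded (a sketch; exact version output will differ per machine):

```
bazel version                      # build tooling for the OSS code
docker --version                   # container runtime for TF Serving
source ~/tf/bin/activate           # the TensorBoard dev virtualenv
docker images tensorflow/serving   # the serving image pulled in step 6
```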

## Development Workflow

1. Enter your TF/TB development virtualenv.<br>
   `source ~/tf/bin/activate`
2. Run TensorBoard, WIT notebooks, and/or WIT demos.<br>
   `cd ~/github/tensorboard`
   - For WIT demos, follow the directions in the [README](./README.md#i-dont-want-to-read-this-document-can-i-just-play-with-a-demo).
     1. `bazel run tensorboard/plugins/interactive_inference/tf_interactive_inference_dashboard/demo:<demoRule>`
     2. Navigate to `http://localhost:6006/tf-interactive-inference-dashboard/<demoName>.html`
   - For use in notebook mode, build the witwidget pip package locally and use it in a notebook.
     1. `rm -rf /tmp/wit-pip` (if it already exists)
     2. `bazel run tensorboard/plugins/interactive_inference/witwidget/pip_package:build_pip_package`
     3. Install the package.
   - For use in Jupyter notebooks, install and enable the locally-built pip package per the instructions in the [README](./README.md#how-do-i-enable-it-for-use-in-a-jupyter-notebook), but instead use `pip install <pathToBuiltPipPackageWhlFile>`, then launch the Jupyter notebook kernel.
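
     Putting the build and install steps together, a sketch of the Jupyter flow (the wheel's exact filename under `/tmp/wit-pip` will vary; the nbextension commands follow the README's Jupyter instructions):

     ```
     # Build the pip package locally.
     rm -rf /tmp/wit-pip
     bazel run tensorboard/plugins/interactive_inference/witwidget/pip_package:build_pip_package
     # Install the built wheel instead of the released package, then enable it.
     pip install <pathToBuiltPipPackageWhlFile>
     jupyter nbextension install --py --symlink --sys-prefix witwidget
     jupyter nbextension enable --py --sys-prefix witwidget
     # Launch the notebook kernel.
     jupyter notebook
     ```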
   - For use in Colab notebooks, upload the package to the notebook and install it from there.
     1. In a notebook cell, to upload a file from local disk, run
        ```
        from google.colab import files
        uploaded = files.upload()
        ```
     2. In a notebook cell, to install the uploaded pip package, run `!pip install <nameOfPackage.whl>`.
        If witwidget was previously installed, uninstall it first.
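
     A sketch of that reinstall cell (the wheel name is whatever the build produced):

     ```
     !pip uninstall -y witwidget
     !pip install <nameOfPackage.whl>
     ```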
   - For TensorBoard use, run tensorboard with any logdir (as WIT does not rely on logdir).<br>
     `bazel run tensorboard -- --logdir /tmp`
     1. WIT needs a served model to query, so serve your trained model through the TF Serving docker container.<br>
        `sudo docker run -p 8500:8500 --mount type=bind,source=<pathToSavedModel>,target=/models/my_model/ -e MODEL_NAME=my_model -t tensorflow/serving`
        - When developing model comparison, serve multiple models at once using the proper config as seen in the appendix.<br>
          `sudo docker run -p 8500:8500 --mount type=bind,source=<pathToSavedModel1>,target=/models/my_model_1 -e MODEL_NAME=my_model_1 --mount type=bind,source=<pathToSavedModel2>,target=/models/my_model_2 -e MODEL_NAME=my_model_2 --mount type=bind,source=<pathToConfigFile>,target=/models/models.config -t tensorflow/serving --model_config_file="/models/models.config"`
     2. Navigate to the WIT tab in TensorBoard and set up WIT (`http://localhost:6006/#whatif&inferenceAddress=localhost%3A8500&modelName=my_model`).<br>
        The inferenceAddress and modelName settings point to the model you served in the previous step. Set all other appropriate options and click “accept”.
     3. When you want to shut down the served model, find the container ID and stop the container.
        ```
        sudo docker container ls
        sudo docker stop <containerIdFromLsOutput>
        ```
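
   As a quick check that a served model is actually up before configuring WIT in step 2, one option (not part of the steps above; it assumes you also publish TF Serving's REST port) is to query the model status endpoint:

   ```
   # Requires adding -p 8501:8501 to the docker run command above; 8501 is
   # TF Serving's REST port, while WIT talks to the gRPC port 8500.
   curl http://localhost:8501/v1/models/my_model
   # A ready model reports a version with state AVAILABLE.
   ```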

3. The Python code has unit tests.
   ```
   bazel test tensorboard/plugins/interactive_inference/...
   ```
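
   To iterate on a single test rather than the whole tree, name its target directly; this sketch assumes the Bazel target mirrors the test file name (e.g. for `utils/inference_utils_test.py`):

   ```
   bazel test tensorboard/plugins/interactive_inference/utils:inference_utils_test
   ```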

4. Add/commit your code changes on a branch in your fork and push it to GitHub.
5. In the GitHub UI for the master TensorBoard repo, create a pull request from your pushed branch.
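
A typical command sequence for steps 4 and 5 (the branch name and commit message are illustrative):

```
git checkout -b my-wit-change
git add -A
git commit -m "Describe your change"
git push origin my-wit-change
# Then open a pull request against tensorflow/tensorboard in the GitHub UI.
```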

## Code Overview

### Backend (Python)

[interactive_inference_plugin.py](interactive_inference_plugin.py) - the Python web backend code for the WIT plugin to TensorBoard. Handles requests from the browser (like load examples, infer examples, …). Loads data from disk. Sends inference requests to servo (TensorFlow Serving). Sends responses back to the browser.<br>
[interactive_inference_plugin_test.py](interactive_inference_plugin_test.py) - UT<br>

[utils/common_utils.py](./utils/common_utils.py) - utilities common to the other Python files<br>
[utils/inference_utils.py](./utils/inference_utils.py) - utility functions for running inference requests through a model<br>
[utils/inference_utils_test.py](./utils/inference_utils_test.py) - UT<br>
[utils/platform_utils.py](./utils/platform_utils.py) - functions specific to the open-source implementation (loading examples from disk, calling to servo)<br>
[utils/test_utils.py](./utils/test_utils.py) - helper functions for UTs<br>

[witwidget/notebook/base.py](witwidget/notebook/base.py) - WitWidgetBase class definition for using WIT in notebooks; shared base class for both the Jupyter and Colab implementations<br>
[witwidget/notebook/visualization.py](witwidget/notebook/visualization.py) - WitConfigBuilder class definition for using WIT in notebooks<br>

[witwidget/notebook/colab/wit.py](witwidget/notebook/colab/wit.py) - backend for running in Colab, along with front-end glue code to display WIT in Colab<br>

[witwidget/notebook/jupyter/wit.py](witwidget/notebook/jupyter/wit.py) - backend for running in Jupyter<br>
[witwidget/notebook/jupyter/js/lib/wit.js](witwidget/notebook/jupyter/js/lib/wit.js) - front-end glue code to display WIT in Jupyter<br>

### Front-end

[tf_interactive_inference_dashboard/tf-interactive-inference-dashboard.html](tf_interactive_inference_dashboard/tf-interactive-inference-dashboard.html) - top-level Polymer element and most of the code for the WIT front-end<br>
[tf_interactive_inference_dashboard/tf-confusion-matrix.html](tf_interactive_inference_dashboard/tf-confusion-matrix.html) - Polymer element for the confusion matrix<br>
[tf_interactive_inference_dashboard/tf-inference-panel.html](tf_interactive_inference_dashboard/tf-inference-panel.html) - Polymer element for the set-up controls<br>
[tf_interactive_inference_dashboard/tf-inference-viewer.html](tf_interactive_inference_dashboard/tf-inference-viewer.html) - Polymer element for the inference results table<br>

### Demos

[tf_interactive_inference_dashboard/demo/tf-interactive-inference-*-demo.html](tf_interactive_inference_dashboard/demo/) - the code for the standalone web demos of WIT, each of which loads a TensorFlow.js model and some data from JSON and runs WIT<br>

### Miscellaneous

[tensorboard/components/vz_example_viewer/vz-example-viewer.*](https://github.com/tensorflow/tensorboard/tree/master/tensorboard/components/vz_example_viewer) - Polymer element for the individual example viewer/editor<br>

## Appendix

### Serving multiple models: models.config contents

```
model_config_list: {
  config: {
    name: "my_model_1",
    base_path: "/models/my_model_1",
    model_platform: "tensorflow"
  },
  config: {
    name: "my_model_2",
    base_path: "/models/my_model_2",
    model_platform: "tensorflow"
  }
}
```
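
Once serving starts with this config (via the multi-model `docker run` command in the development workflow above), each model's status can be checked individually; this sketch assumes the REST port 8501 was also published with `-p 8501:8501`:

```
curl http://localhost:8501/v1/models/my_model_1
curl http://localhost:8501/v1/models/my_model_2
```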

tensorboard/plugins/interactive_inference/README.md

Lines changed: 4 additions & 0 deletions

For TensorFlow GPU support, use the `witwidget-gpu` package instead of `witwidget`.

Note that you may need to run `!sudo jupyter labextension ...` commands depending on your notebook setup.

Use of WIT after installation is the same as with the other notebook installations.

## How can I help develop it?

Check out the [development guide](./DEVELOPMENT.md).