We typically deploy this locally. To bring everything up, you need Docker and Docker Compose.
If you haven't already, create the Docker network:

```shell
docker network create nginx-net
```

Then bring the services up:

```shell
make server-compose
```
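For orientation, a Compose file wired to that network typically looks like the sketch below. The service name, image, and port are placeholders; only the external `nginx-net` network comes from this README.

```yaml
# Hypothetical docker-compose.yml fragment -- service name, image, and
# port are illustrative assumptions, not the project's actual config.
version: "3"
services:
  serving:
    image: tensorflow/serving
    ports:
      - "8501:8501"
    networks:
      - nginx-net
networks:
  nginx-net:
    external: true   # must already exist: docker network create nginx-net
```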
We pass through TensorFlow Serving's model metadata at this URL:

```
/model/metadata
```
If the model's input names change, the code must be updated to match. The input layers should be named `input_2048` and `input_4096`, and the output layer should be named `output`.
Classify programmatically by calling:

```
/classify?smiles=<SMILES>
```

You can also add a `cached` flag to the query parameters to return a cached result, which is faster.
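A client should URL-encode the SMILES string, since characters like `=` and `#` are common in SMILES but not URL-safe. A minimal sketch, assuming a placeholder base URL and a `cached=true` flag value:

```python
from urllib.parse import urlencode

# Sketch: build the classify URL. The base URL and the exact value of the
# cached flag are assumptions; the /classify path and the smiles/cached
# parameter names come from this README.

def classify_url(base: str, smiles: str, cached: bool = False) -> str:
    params = {"smiles": smiles}
    if cached:
        params["cached"] = "true"  # assumed flag value
    return f"{base}/classify?{urlencode(params)}"

print(classify_url("http://localhost", "CCO"))
# http://localhost/classify?smiles=CCO
print(classify_url("http://localhost", "C1=CC=CC=C1", cached=True))
# http://localhost/classify?smiles=C1%3DCC%3DCC%3DC1&cached=true
```

`urlencode` handles the percent-encoding, so benzene's `=` characters survive the round trip intact.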
The software is licensed under MIT. Additionally, all data, models, and the ontology are licensed under CC0.
We try our best to balance privacy with understanding how our tool is used. Our logs record which structures were classified, but not which users queried them.