Getting Started with Booklet.ai

It takes just a couple of minutes to connect Booklet.ai to your deployed ML model. Once connected, Booklet.ai serves a web app and an HTTP API for your model, and lets you integrate its results with your existing software.

Follow this guide to connect your first model to Booklet.ai.

Prerequisites

  • A Booklet.ai account. Your first model is free. Sign up to get started.
  • Using AWS SageMaker? Follow our instructions to grant Booklet.ai read-only SageMaker access.
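For context, read-only SageMaker access is typically scoped to listing and describing endpoints. The IAM policy below is an illustrative sketch only; follow our SageMaker instructions for the exact permissions Booklet.ai needs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sagemaker:ListEndpoints",
        "sagemaker:DescribeEndpoint"
      ],
      "Resource": "*"
    }
  ]
}
```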

1. Add an ML Model

From your Booklet.ai models page, select the type of model you wish to add:

[Screenshot: selecting a SageMaker model]

Booklet.ai will ask for platform-specific information to connect to the model. For example, if the model is deployed via SageMaker, you can pick from a list of your SageMaker endpoints. When adding a custom model, you provide the model's HTTP inference URL.
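If you're connecting a custom model, all Booklet.ai needs is an HTTP inference URL it can send requests to. Below is a minimal sketch of such an endpoint using Python's standard library; the `predict` function and the JSON request/response shape are illustrative assumptions, not a Booklet.ai requirement.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Stand-in for your real model: here, the "score" is just the
    # mean of the numeric feature values.
    return {"score": sum(features.values()) / max(len(features), 1)}

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read a JSON body of features, run the model, return JSON.
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length))
        body = json.dumps(predict(features)).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8080):
    # Blocks forever; call serve() to expose POST http://localhost:8080/
    HTTPServer(("", port), InferenceHandler).serve_forever()
```

Once deployed, the URL serving this handler is what you'd paste into the custom model form.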

2. Add Model Metadata (optional)

[Screenshot: model metadata fields]

Once you’ve connected and named your ML Model, the Add an ML Model screen contains optional fields to configure metadata:

  • Image URL: Use this field to give your model an icon. Provide a URL to an image hosted elsewhere, such as an AWS S3 URL.
  • Description: Use this field to explain what your model does and how to use the web app. This field supports Markdown formatting, so you can add headers, links, and other formatting.
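For example, a Markdown description for a hypothetical churn model might look like this (the model details are purely illustrative):

```markdown
## Customer Churn Predictor

Predicts the probability that a customer will churn in the next 30 days.

**How to use:** fill in the customer fields below and press *Run*.
See the [model card](https://example.com/model-card) for training details.
```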

3. Configure the Web App Demo Form

[Screenshot: example web app inference form]

The model inference UI above uses a feature schema to provide default input values and post-processing to render the prediction as an HTML table.

Build your model's inference UI by providing a feature schema and, optionally, configuring post-processing of the result.
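As a purely illustrative sketch (the exact format is covered in our web app documentation), a feature schema generally maps each model input to a type and a default value, along these lines:

```json
{
  "features": [
    {"name": "sepal_length", "type": "float", "default": 5.1},
    {"name": "sepal_width",  "type": "float", "default": 3.5}
  ]
}
```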

You can make the model private (visible only to your team) or public (visible to everyone). By default, models are private.

For more details on the available web app configuration options, see our web app documentation.

4. Enable the HTTP Endpoint (optional)

You can optionally expose an HTTP API so developers can perform model inference programmatically. For consistency, the API uses the same feature schema as your web app.
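To illustrate the idea, here is a hypothetical sketch of a programmatic inference call in Python. The endpoint URL, auth header, and payload shape are all illustrative assumptions, not documented API details; contact support@booklet.ai for the real request format.

```python
import json
import urllib.request

# Hypothetical endpoint URL, for illustration only.
BOOKLET_ENDPOINT = "https://app.booklet.ai/models/my-model/infer"

def build_inference_payload(features):
    # Assumes the API accepts the same feature names defined in the
    # web app's feature schema.
    return json.dumps({"features": features})

def infer(features, api_key):
    # Illustrates the shape of a programmatic inference call.
    req = urllib.request.Request(
        BOOKLET_ENDPOINT,
        data=build_inference_payload(features).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # hypothetical auth scheme
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```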

For more info about the HTTP endpoint, please reach out at support@booklet.ai.

5. Integrate the Model (optional)

By configuring sources and destinations, you can easily feed your model's results into your organization's existing business processes. We support:

  • Redshift, Snowflake, Postgres, and other data warehouses
  • Mailchimp
  • Intercom

To request an integration or learn more, reach out to support@booklet.ai.