Build an ASYNC object detection pipeline

In this tutorial you'll build your first ASYNC object detection pipeline and trigger the pipeline to process some images.

#Build via no-code Console

After onboarding, you will be redirected to the Pipeline page (accessible from the left sidebar), where you can build your first VDP pipeline by clicking Set up your first pipeline.

Empty pipeline list page of the VDP Console
TIP

You can follow the tutorial entirely from the Pipeline page. Alternatively, you can navigate to the Source, Model and Destination pages to create each component first, then use them to configure a pipeline on the Pipeline page later.

#Add an HTTP source

An HTTP source accepts HTTP requests with image payloads to be processed by a pipeline.

To set it up,

  1. click the Pipeline mode ▾ drop-down and choose Async,
  2. click the Source type ▾ drop-down and choose HTTP, and
  3. click Next.
Add an HTTP source to set up an async pipeline in the VDP Console
INFO

Check our growing list of Source Connectors.

#Import a model from GitHub

To process images, here we import a model from our public GitHub repo instill-ai/model-yolov7-dvc.

To set it up,

  1. give your model a unique ID,
  2. [optional] add a description,
  3. click the Model source ▾ drop-down and choose GitHub,
  4. fill in the GitHub repository URL instill-ai/model-yolov7-dvc, and
  5. click Set up.
Import a model from a GitHub repo via VDP Console

VDP will fetch all the releases of the GitHub repository. Each release is converted into one model instance, using the release tag as the corresponding model instance ID.
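
To see the model instances created from those releases, you can list them over the REST API. Below is a minimal sketch, assuming the v1alpha list endpoint follows the same path pattern as the deploy call shown in the low-code section later in this tutorial:

# List the instances of the imported model
# (the GET path is an assumption, mirroring the deploy endpoint used below)
curl -X GET http://localhost:8080/v1alpha/models/yolov7/instances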

INFO

Check our growing list of model sources to learn about how to import models from other platforms.

#Deploy a model instance of the imported model

Once the model is imported,

  1. click the Model instances ▾ drop-down,
  2. pick one model instance, and
  3. click Deploy to put it online.
Deploy a model instance via VDP Console
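
Deployment can take a moment. If you prefer checking the instance state from a terminal, a sketch like the one below should work, assuming the model instance resource supports GET at the same path used by the deploy endpoint in the low-code section:

# Inspect the model instance; its state should report online once deployed
# (GET on this path is an assumption based on the deploy endpoint below)
curl -X GET http://localhost:8080/v1alpha/models/yolov7/instances/v1.0-cpu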

#Add a PostgreSQL destination

To set it up,

  1. give your destination a unique ID,
  2. [optional] add a description,
  3. click the Destination type ▾ drop-down and choose Postgres, and
  4. fill in the required fields.
Add a PostgreSQL destination to create a pipeline in the VDP console
INFO

Make sure the PostgreSQL database is accessible to VDP via the specified Host and Port, and that the tutorial database has been created in advance.
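
If the tutorial database does not exist yet, you can create it with psql, for example (the connection details below match the destination configuration used in this tutorial):

# Create the `tutorial` database in advance so the destination connector can write to it
psql -h 100.113.68.102 -p 5432 -U postgres -c 'CREATE DATABASE tutorial;'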

#Set up the pipeline

Almost done! Just

  1. give your pipeline a unique ID,
  2. [optional] add a description, and
  3. click Set up.
Set up a pipeline in the VDP Console

Now you should see the newly created ASYNC pipeline on the Pipeline page 🎉

Pipeline list page of the VDP Console

#Build via low-code

You can programmatically build an ASYNC pipeline via the REST API.



Step 1: Add an HTTP source


curl -X POST http://localhost:8080/v1alpha/source-connectors -d '{
  "id": "source-http",
  "source_connector_definition": "source-connector-definitions/source-http",
  "connector": {
    "configuration": {}
  }
}'

Step 2: Import a model from GitHub


curl -X POST http://localhost:8080/v1alpha/models -d '{
  "id": "yolov7",
  "model_definition": "model-definitions/github",
  "configuration": {
    "repository": "instill-ai/model-yolov7-dvc"
  }
}'

Step 3: Deploy a model instance

Choose the model instance v1.0-cpu to deploy.


curl -X POST http://localhost:8080/v1alpha/models/yolov7/instances/v1.0-cpu/deploy

Step 4: Add a PostgreSQL destination


curl -X POST http://localhost:8080/v1alpha/destination-connectors -d '{
  "id": "postgres-db",
  "destination_connector_definition": "destination-connector-definitions/destination-postgres",
  "connector": {
    "description": "The PostgreSQL database in my basement",
    "configuration": {
      "host": "100.113.68.102",
      "port": 5432,
      "database": "tutorial",
      "schema": "public",
      "username": "postgres",
      "password": "password",
      "ssl": false
    }
  }
}'

Step 5: Set up your first pipeline


curl -X POST http://localhost:8080/v1alpha/pipelines -d '{
  "id": "detection",
  "description": "A magic pipeline to detect objects in images",
  "recipe": {
    "source": "source-connectors/source-http",
    "model_instances": [
      "models/yolov7/instances/v1.0-cpu"
    ],
    "destination": "destination-connectors/postgres-db"
  }
}'


#Trigger your pipeline for the first time

Once all components are in a positive state, the detection pipeline will be activated automatically. You can make a request to trigger the pipeline to process a batch of multiple images via remote image URLs, Base64-encoded images or multipart form data:


curl -X POST http://localhost:8080/v1alpha/pipelines/detection/trigger -d '{
  "inputs": [
    {
      "image_url": "https://artifacts.instill.tech/imgs/dog.jpg"
    },
    {
      "image_url": "https://artifacts.instill.tech/imgs/polar-bear.jpg"
    }
  ]
}'

where http://localhost:8080 is the default api-gateway URL.

INFO

api-gateway is the single point of entry into the backend services. Please check the VDP system architecture.
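
The request above uses remote image URLs. If you prefer sending Base64-encoded images instead, a sketch along these lines should work; note that the image_base64 field name is an assumption, so verify it against the pipeline trigger API reference:

# Trigger the pipeline with a Base64-encoded local image
# (the `image_base64` field name is an assumption — check the API reference)
curl -X POST http://localhost:8080/v1alpha/pipelines/detection/trigger -d '{
  "inputs": [
    {
      "image_base64": "'"$(base64 < dog.jpg | tr -d '\n')"'"
    }
  ]
}'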

An HTTP response like the following will be returned


{
  "data_mapping_indices": [
    "01GDR4ZW7W4T2H2G8MK79Y49PG",
    "01GDR4ZW7W4T2H2G8MK8AR1T2B"
  ],
  "model_instance_outputs": []
}

and in the PostgreSQL tutorial database, you should see


tutorial> SELECT * FROM _airbyte_raw_vdp
+--------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------+
| _airbyte_ab_id | _airbyte_data | _airbyte_emitted_at |
|--------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------|
| 36474aa5-9257-463f-920b-95b3399cd5f4 | {"index": "01GDR4ZW7W4T2H2G8MK79Y49PG", "pipeline": {"name": "pipelines/detection", "recipe": {"source": "source-connectors/source-http", "destination": "destination-connectors/postgres-db", "model_instances": ["models/yolov7/instances/v1.0-cpu"]}}, "detection": {"objects": [{"score": 0.9597808, "category": "dog", "bounding_box": {"top": 102, "left": 324, "width": 208, "height": 405}}, {"score": 0.92909366, "category": "dog", "bounding_box": {"top": 198, "left": 130, "width": 198, "height": 2... | 2022-09-24 16:23:55.288+00 |
| 95b9f812-6ff5-4708-9d3f-8e6c97bf4dbe | {"index": "01GDR4ZW7W4T2H2G8MK8AR1T2B", "pipeline": {"name": "pipelines/detection", "recipe": {"source": "source-connectors/source-http", "destination": "destination-connectors/postgres-db", "model_instances": ["models/yolov7/instances/v1.0-cpu"]}}, "detection": {"objects": [{"score": 0.9475409, "category": "bear", "bounding_box": {"top": 457, "left": 1372, "width": 1300, "height": 2175}}]}, "model_instance": "models/yolov7/instances/v1.0-cpu"} | 2022-09-24 16:23:55.288+00 |
+--------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+----------------------------+
SELECT 2
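
Because the pipeline is asynchronous, the data_mapping_indices returned by the trigger request are what link each input image to its row in the destination. A minimal sketch for looking up a single result, assuming _airbyte_data is stored as jsonb (the usual raw format written by the Airbyte Postgres destination):

# Fetch the detection result for one input by its data_mapping_index
# (assumes `_airbyte_data` is a jsonb column, as in the table above)
psql -h 100.113.68.102 -p 5432 -U postgres -d tutorial -c \
  "SELECT _airbyte_data FROM _airbyte_raw_vdp WHERE _airbyte_data->>'index' = '01GDR4ZW7W4T2H2G8MK79Y49PG';"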

🙌 That's it! You just built your first ASYNC object detection pipeline and triggered it to convert unstructured image data into structured, analysable insights.

#What's next?

Check out Learn VDP. If you have any problem at all, join our Discord to get community support.
