Docker Compose is a straightforward way to set up Instill Core on local machines or remote instances.
Instructions on this page have been tested on macOS and Ubuntu 22.04. That said, we strongly recommend using our auto-deployment tool, Instill CLI, for the deployment process.
#Prerequisites
Make sure you have the prerequisites set up:
- macOS or Linux - Instill Core works on macOS or Linux, but does not support Windows at the moment.
- Docker and Docker Compose - Instill Core uses Docker Compose to run all services locally. See the official instructions and the Docker Resource Requirements for Instill Core.
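Before continuing, it can help to confirm that both tools are available on your machine. A quick check, assuming Docker Compose V2 is installed as the `docker compose` plugin, looks like this:

```bash
# Print the installed Docker and Docker Compose versions
docker --version
docker compose version
```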
#Setup
The code in the main branch reflects ongoing development progress for the next release and may not work as expected. If you need a stable alpha version, use the latest release instead.
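If you are unsure which release tag is the latest, one convenient way to list the tags published on the repository is shown below; this is a quick sketch rather than part of the official setup steps:

```bash
# List the release tags available on the remote repository
git ls-remote --tags https://github.com/instill-ai/core.git
```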
On your workstation, run:
```bash
git clone -b v0.5.0-alpha https://github.com/instill-ai/core.git && cd core
make all
```
Running this command will launch both Instill VDP and Instill Model simultaneously.
Here's a handy tip: You can launch individual Instill projects by specifying the project name using the PROJECT parameter.
If you're interested in creating AI workflows without the complexities of model serving, use the following command to launch Instill VDP:
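```bash
# Launch Instill VDP only
make all PROJECT=vdp
```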
Alternatively, if your primary focus is on building AI models, you can launch Instill Model with the following command:
```bash
# Launch Instill Model only
make all PROJECT=model
```
To launch only the foundation services, use the following command:
```bash
# Launch the foundation services
make all PROJECT=core
```
These commands give you the flexibility to deploy only the components you need for your AI projects. Once all services are up and running, the Console UI is ready to go at http://localhost:3000. Proceed by following the authorisation guide.
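If you want to confirm that the Console is reachable before opening it in a browser, a quick check from the same machine might look like this (assuming curl is installed):

```bash
# Expect an HTTP response once the Console UI is up
curl -I http://localhost:3000
```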
#Shutdown
To shut down and clean up all Instill Core resources, run:
```bash
make down
```
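To verify that nothing is left running, you can list any remaining containers. The name filter below assumes the Instill Core containers carry "instill" in their names; adjust or drop the filter if your setup differs:

```bash
# Show any running containers whose names contain "instill"
docker ps --filter "name=instill"
```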