Explore our comprehensive collection of examples to see the capabilities of Instill Core in action. Whether you're interested in pipelines, models, tutorials, or notebooks, these resources showcase practical AI and data applications and show you how to leverage our services effectively.
# Pipelines
Discover the practical and diverse uses of Instill VDP in real-world scenarios. Click here to explore all of the public pipelines available on Instill Cloud.
- Web scraper: Automate data extraction from websites with ease.
- Audio transcription: Convert spoken content into text quickly and accurately.
- Ask your catalog: Query your Instill Catalog to retrieve relevant information efficiently.
- Chinese content writer: Create high-quality Chinese text content with Fireworks AI.
- PicassoAI art generator: Craft unique and creative artwork using Mistral AI and OpenAI.
- Generate short-film scripts: Develop short-film scripts using Mistral AI.
- Interview helper: Prepare impressive STAR interview responses with the Groq AI component.
- Customer ticket triage: Streamline customer support by automating ticket triage and email drafting.
- Project manager assistant: Organize and manage Jira issues efficiently with AI categorization.
- Contract reviewer: Analyze and review contracts to ensure compliance and accuracy.
- Resume screening: Screen resumes effectively using OpenAI's structured outputs feature.
- Explain and translate: Provide clear explanations of topics in multiple languages.
- GitHub issue summarization: Generate concise summaries of GitHub issues using Claude-3.5-Sonnet to enhance project management.
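If you want to call one of these pipelines from your own code rather than the console, the sketch below shows the general shape of a trigger request over HTTP. The host, API version prefix, endpoint path, and input fields are assumptions for illustration only; consult the Instill Cloud API reference for the exact contract of the pipeline you use.

```python
import os

import requests

# Assumed values for illustration only -- replace them with the real host,
# namespace, and pipeline ID from the Instill Cloud API reference.
API_BASE = "https://api.instill.tech"        # assumed host
NAMESPACE = "your-namespace"                 # your user or organization ID
PIPELINE_ID = "web-scraper"                  # one of the public pipelines above
API_TOKEN = os.environ["INSTILL_API_TOKEN"]  # personal access token

# Assumed endpoint pattern; the actual path and version prefix may differ.
url = f"{API_BASE}/v1beta/namespaces/{NAMESPACE}/pipelines/{PIPELINE_ID}/trigger"

# Pipelines take a list of input objects whose fields depend on the pipeline's
# own input schema; a single 'url' field is assumed here for illustration.
payload = {"inputs": [{"url": "https://www.instill.tech"}]}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=60,
)
response.raise_for_status()
print(response.json())
```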
# Models
Explore the collection of advanced AI models that come pre-packaged and ready to serve on the Instill Model MLOps platform. Click here to explore all of the models currently served on Instill Cloud.
- gte-Qwen2-1.5B-instruct: A powerful instruction-based model for text embedding.
- llama2-7b-chat: Engage in natural and context-aware conversations with this chat model.
- llama3-8b-instruct: Advanced instruction-following capabilities for complex tasks.
- llamacode-7b: Specialize in coding and programming tasks with this dedicated model.
- llava-1-6-13b: A robust model for multi-modal tasks involving vision and language.
- mobilenetv2: Efficient model for mobile and embedded vision applications.
- stable-diffusion-xl: Generate high-quality images with advanced diffusion techniques.
- stella-en-1.5B-v5: A versatile model for various natural language embedding tasks.
- tinyllama: A compact yet powerful model designed for lightweight applications.
- yolov7-stomata: A custom fine-tuned model for detecting plant stomata in images.
- yolov7: A robust model for general object detection tasks.
- zephyr-7b: A 7B chat model fine-tuned for helpful, high-quality conversation and language tasks.
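Served models can be invoked in much the same way as pipelines. The sketch below assumes a text-generation model and an illustrative endpoint pattern and payload shape rather than the documented schema; check the Instill Model API reference for the real trigger format of each model task.

```python
import os

import requests

# Assumed values for illustration only.
API_BASE = "https://api.instill.tech"        # assumed host
NAMESPACE = "your-namespace"
MODEL_ID = "llama3-8b-instruct"              # one of the models listed above
API_TOKEN = os.environ["INSTILL_API_TOKEN"]

# Assumed endpoint pattern and payload shape; the documented trigger
# contract for each model task may differ.
url = f"{API_BASE}/v1alpha/namespaces/{NAMESPACE}/models/{MODEL_ID}/trigger"
payload = {
    "taskInputs": [
        {"prompt": "Summarize the benefits of instruction-tuned models in one sentence."}
    ]
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=120,
)
response.raise_for_status()
print(response.json())
```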
# Tutorials
Our tutorials provide hands-on guides to building and deploying AI solutions using our platform. Each tutorial offers step-by-step instructions to help you create effective and customized AI applications.
- Build a customer ticket classifier & email drafting pipeline: Learn how to automate customer support workflows and email drafting.
- Build a webpage summarization and Q&A pipeline: Create a pipeline for summarizing web pages and answering questions.
- Build an AIGC x Web3 pipeline: Combine AI generation with blockchain technology to register your AI-generated data as Web3 assets.
- Serve custom models on Instill Core: Discover how to deploy and manage custom models using Instill Core.
- Serve Llama2-7b-Chat Locally on Instill Core: Deploy the Llama2-7b-Chat model locally for enhanced chat capabilities.
- Serve LLaVA with Instill Model: Implement and serve the LLaVA model using Instill Core and Instill Cloud.
# Notebooks
Explore interactive notebooks that provide practical examples of AI applications built with our products and services in Python environments.
- Instance Segmentation with Instill Cloud: Perform instance segmentation tasks using a custom fine-tuned model served on Instill Cloud.
- Generating Structured Outputs from LLMs: Explore structured output generation from LLMs and overcome key challenges with Instill VDP.
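To give a flavour of what these notebooks do in a plain Python environment, the cell-style sketch below parses and lightly validates a structured LLM response using only the standard library. The response shape and field names are invented for illustration and are not taken from any specific notebook; a real notebook would obtain the raw output from a pipeline or model trigger response.

```python
import json
from dataclasses import dataclass

# Hypothetical raw output from a structured-output LLM call, invented for
# illustration; in a notebook this would come from a trigger response.
raw_output = '{"title": "Q3 report", "sentiment": "positive", "score": 0.87}'


@dataclass
class Summary:
    title: str
    sentiment: str
    score: float


def parse_summary(text: str) -> Summary:
    """Parse a JSON payload and lightly validate it into a Summary record."""
    data = json.loads(text)
    if data["sentiment"] not in {"positive", "neutral", "negative"}:
        raise ValueError(f"unexpected sentiment: {data['sentiment']}")
    return Summary(
        title=data["title"],
        sentiment=data["sentiment"],
        score=float(data["score"]),
    )


print(parse_summary(raw_output))
```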
More notebooks coming soon in the Instill AI Cookbook! 🚀