🐤

DCanary Documentation

Decentralized CI/CD Pipeline System

Welcome to the Future of CI/CD

DCanary is the world's first truly decentralized CI/CD platform that brings blockchain-level security and transparency to software deployment. Built on the Internet Computer, it eliminates single points of failure while ensuring cryptographic verification and distributed consensus for every build.

Zero Downtime
Cryptographic Security
Consensus Verification
Immutable History

What's Different?

Unlike traditional platforms that rely on centralized infrastructure, DCanary leverages blockchain technology to create an immutable, transparent, and censorship-resistant deployment pipeline.

Traditional CI/CD:  ❌ Single point of failure
DCanary:            ✅ Distributed consensus

Tamper-Proof

Every build is cryptographically signed and verified across multiple independent nodes.

Decentralized

No central authority or single point of failure. Runs entirely on blockchain infrastructure.

Developer First

Familiar CLI tools, GitHub integration, and workflow patterns you already know.

Ready to Get Started?

Jump into our quick start guide and have your first decentralized pipeline running in just 5 minutes.

Why Choose DCanary?

Cryptographic Security

Every build is cryptographically signed and verified across multiple nodes

True Decentralization

No central authority, runs entirely on Internet Computer blockchain

Immutable History

All pipeline executions are permanently recorded on-chain

Consensus Verification

Multiple executors must agree before marking builds as successful

Developer Friendly

Simple CLI, GitHub integration, familiar workflow patterns

Global Availability

Distributed across the globe with automatic failover

Quick Start Guide

Get Running in 5 Minutes

Follow these steps to set up your first decentralized CI/CD pipeline

1

Install DCanary CLI

npm install -g @dcanary/cli

Install the DCanary command-line interface globally on your system. This provides all the tools needed to create, configure, and manage your decentralized CI/CD pipelines.

Note: Requires Node.js 16+ and npm. The CLI will automatically configure your identity and connect to the Internet Computer network.

2

Initialize Your Project

dcanary init --type nodejs
# Other project types:
dcanary init --type python # Python projects
dcanary init --type rust # Rust projects
dcanary init --type docker # Docker-based builds

Set up DCanary configuration for your project type. This creates a dcanary.yaml file with sensible defaults and project-specific build commands.

What this does: Creates configuration files, sets up build stages, configures secrets management, and establishes webhook endpoints for your repository.

3

Create Your Pipeline

dcanary pipeline create --name "My App" --repo "github:user/repo"
# Additional options:
--branch main # Specify default branch
--environment production # Set deployment environment
--executors 3 # Number of consensus executors

Connect your GitHub repository and create your first pipeline. This registers your project with the DCanary network and sets up the necessary smart contracts on the Internet Computer.

Behind the scenes: This deploys your pipeline configuration to a canister, generates unique webhook URLs, and enrolls your project in the consensus network.

4

Deploy and Monitor

dcanary deploy && dcanary status
# Monitor in real-time:
dcanary logs --follow # Live build logs
dcanary dashboard --web # Web dashboard
dcanary consensus --details # Consensus status

Deploy your pipeline to the network and monitor its status. Your pipeline is now live and will automatically trigger on code pushes, executing builds across multiple independent nodes.

Success! Your decentralized CI/CD pipeline is now active. Configure webhooks in your repository settings to enable automatic builds on code changes.

Prerequisites

  • Node.js 16+ and npm installed
  • Git repository with proper access tokens
  • Internet Computer account (for mainnet deployment)
  • DFX SDK for local development

Installation Guide

DCanary CLI Installation

Choose your preferred installation method

Option 1: NPM (Recommended)

npm install -g @dcanary/cli
# Verify installation
dcanary --version

Option 2: Direct Download

curl -fsSL https://install.dcanary.dev | bash
Installs the latest version directly from our CDN

Option 3: Build from Source

git clone https://github.com/modaniels/dcanary.git
cd dcanary && npm install && npm run build
npm link

System Requirements

Component    Minimum                  Recommended
Node.js      v16.0.0                  v18.0.0+
RAM          512 MB                   2 GB+
Storage      100 MB                   1 GB+
Network      Basic internet           Stable broadband
OS           Linux, macOS, Windows    Linux/macOS preferred

System Architecture

Decentralized Architecture Overview

Built on Internet Computer for true decentralization


   ┌─────────────────────────────────────────────────────────────────┐
   │                    Internet Computer Network                    │
   ├─────────────────────────────────────────────────────────────────┤
   │  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐  │
   │  │   Webhook       │  │   Pipeline      │  │  Verification   │  │
   │  │   Canister      │  │   Config        │  │   Canister      │  │
   │  │                 │  │   Canister      │  │                 │  │
   │  └─────────────────┘  └─────────────────┘  └─────────────────┘  │
   │           │                    │                    │           │
   │           ▼                    ▼                    ▼           │
   │  ┌───────────────────────────────────────────────────────────┐  │
   │  │                  Build Executor Canister                  │  │
   │  │            (Orchestrates off-chain executors)             │  │
   │  └───────────────────────────────────────────────────────────┘  │
   │                                │                                │
   └────────────────────────────────┼────────────────────────────────┘
                                    │
           ┌────────────────────────┼────────────────────────┐
           │                        │                        │
           ▼                        ▼                        ▼
┌─────────────────────┐  ┌─────────────────────┐  ┌─────────────────────┐
│     Executor #1     │  │     Executor #2     │  │     Executor #3     │
│     (Off-chain)     │  │     (Off-chain)     │  │     (Off-chain)     │
│                     │  │                     │  │                     │
│ ┌─────────────────┐ │  │ ┌─────────────────┐ │  │ ┌─────────────────┐ │
│ │  Docker/K8s/VM  │ │  │ │  Docker/K8s/VM  │ │  │ │  Docker/K8s/VM  │ │
│ │Build Environment│ │  │ │Build Environment│ │  │ │Build Environment│ │
│ └─────────────────┘ │  │ └─────────────────┘ │  │ └─────────────────┘ │
└─────────────────────┘  └─────────────────────┘  └─────────────────────┘

On-Chain Components

Webhook Canister

Receives and validates Git push events from GitHub, GitLab, and other SCM providers

Functions: Event validation, signature verification, rate limiting, and pipeline triggering
Pipeline Config

Stores immutable build configurations, secrets, and deployment instructions

Features: Version control, encrypted secrets, multi-environment support, rollback capabilities
Build Executor

Orchestrates distributed build execution across multiple independent nodes

Capabilities: Load balancing, fault tolerance, consensus coordination, result aggregation
Verification

Validates consensus results and permanently stores execution history

Security: Cryptographic verification, Byzantine fault tolerance, immutable audit logs

Off-Chain Components

Executor Nodes

Independent build execution environments

Build Environments

Containerized or VM-based build isolation

Network Layer

Secure communication channels

Crypto Verification

Digital signatures and hash verification

Real-World Examples

React Web Application

Modern frontend deployment pipeline

Deploy a React application with automated testing, building, and deployment to a CDN. This example shows how to handle modern frontend workflows with dependency management, testing, and optimized production builds.

dcanary.yaml - React Project

name: react-app-pipeline
version: "1.0"

environment:
  image: "node:18-alpine"
  variables:
    NODE_ENV: production
    CI: true

stages:
  - name: install
    commands:
      - npm ci --production=false
    cache:
      paths: ["node_modules/"]
      key: "${checksum:package-lock.json}"

  - name: lint
    commands:
      - npm run lint
      - npm run type-check
    parallel: true

  - name: test
    commands:
      - npm test -- --coverage --watchAll=false
    coverage: true
    artifacts:
      - coverage/

  - name: build
    commands:
      - npm run build
    artifacts:
      - build/
      - package.json
    
  - name: deploy
    commands:
      - aws s3 sync build/ s3://$S3_BUCKET --delete
      - aws cloudfront create-invalidation --distribution-id $CLOUDFRONT_ID --paths "/*"
    when: branch == "main"
    environment:
      variables:
        AWS_REGION: us-east-1

secrets:
  - NPM_TOKEN
  - AWS_ACCESS_KEY_ID
  - AWS_SECRET_ACCESS_KEY
  - S3_BUCKET
  - CLOUDFRONT_ID

notifications:
  slack:
    webhook: $SLACK_WEBHOOK
    channels: ["#deployments"]

Key Features

  • Intelligent dependency caching
  • Parallel lint and type checking
  • Code coverage reporting
  • Automated CDN deployment

Typical Timeline

  • Install dependencies: 45s
  • Lint + Type check: 30s
  • Run tests: 2m 15s
  • Build production: 1m 30s
  • Total time: ~5 minutes

Smart Canisters

Webhook Canister

Git Event Processing Hub

Receives and processes Git webhook events from GitHub, GitLab, and other SCM providers. Acts as the entry point for all pipeline triggers with enterprise-grade security.

Key Functions

  • Webhook signature validation
  • Event parsing and normalization
  • Repository authentication
  • Pipeline trigger orchestration

Supported Events

  • Push events (all branches)
  • Pull request events
  • Tag creation/deletion
  • Release events
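
To make the event parsing and normalization step concrete, here is a minimal TypeScript sketch (not the canister's actual implementation) that maps a GitHub push payload onto the normalized trigger shape used throughout these docs. The GitHub fields ref, after, and repository.full_name are standard webhook payload fields; the rest is illustrative.

// Minimal sketch: normalize a GitHub push webhook payload into a
// provider-agnostic trigger record. Illustrative only; not the real
// Webhook Canister code.

interface GitHubPushPayload {
  ref: string;                       // e.g. "refs/heads/main"
  after: string;                     // commit SHA after the push
  repository: { full_name: string }; // e.g. "user/repo"
}

interface NormalizedTrigger {
  repository: string;
  branch: string;
  commit_sha: string;
  event_type: "push";
}

function normalizeGitHubPush(payload: GitHubPushPayload): NormalizedTrigger {
  // "refs/heads/main" -> "main"; tag pushes arrive as "refs/tags/..."
  const branch = payload.ref.replace(/^refs\/heads\//, "");
  return {
    repository: payload.repository.full_name,
    branch,
    commit_sha: payload.after,
    event_type: "push",
  };
}

// normalizeGitHubPush({ ref: "refs/heads/main", after: "abc123",
//                       repository: { full_name: "user/repo" } })
// => { repository: "user/repo", branch: "main", commit_sha: "abc123", event_type: "push" }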

Pipeline Configuration Canister

Immutable Build Definitions

Stores immutable pipeline configurations, build scripts, and encrypted secrets. Provides version control for your CI/CD definitions with blockchain-level security.

pipeline-config.json
{
  "name": "my-app-pipeline",
  "stages": [
    {"name": "install", "commands": ["npm ci"]},
    {"name": "test", "commands": ["npm test"]},
    {"name": "build", "commands": ["npm run build"]}
  ],
  "environment": "node:18",
  "secrets": ["NPM_TOKEN", "API_KEY"]
}

Build Executor Canister

Distributed Build Orchestration

Orchestrates distributed build execution across multiple off-chain executor nodes. Manages job distribution, monitoring, and result collection with consensus verification.

Job Scheduling

Smart load balancing across executor nodes

Real-time Monitoring

Live build progress and health metrics

Result Validation

Cryptographic consensus verification

Consensus Mechanism

How Consensus Works

Byzantine Fault Tolerant Verification

1

Job Distribution

Build jobs are distributed to multiple independent executor nodes (minimum 3, typically 5-7) across different geographical regions for maximum redundancy and security. Each executor receives identical build instructions but operates in complete isolation.

Distribution Strategy:
  • Geographic diversity: Executors spread across continents
  • Infrastructure variety: Different cloud providers and hardware
  • Network isolation: Each executor operates independently
  • Load balancing: Jobs distributed based on current capacity
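
The sketch below illustrates one way such a distribution strategy could look in TypeScript: pick executors with spare capacity first, then spread the selection across regions. The Executor shape and the scoring rule are assumptions for illustration, not DCanary's actual scheduler.

// Illustrative sketch of geography-aware, capacity-based executor selection.
// The Executor shape and scoring rule are assumptions, not DCanary internals.

interface Executor {
  id: string;
  region: string;      // e.g. "eu-west", "us-east", "ap-south"
  activeJobs: number;
  maxJobs: number;
}

function selectExecutors(executors: Executor[], count: number): Executor[] {
  // Prefer executors with spare capacity (lowest utilization first).
  const available = executors
    .filter((e) => e.activeJobs < e.maxJobs)
    .sort((a, b) => a.activeJobs / a.maxJobs - b.activeJobs / b.maxJobs);

  const chosen: Executor[] = [];
  const usedRegions = new Set<string>();

  // First pass: at most one executor per region for geographic diversity.
  for (const e of available) {
    if (chosen.length === count) break;
    if (!usedRegions.has(e.region)) {
      chosen.push(e);
      usedRegions.add(e.region);
    }
  }
  // Second pass: fill any remaining slots by capacity regardless of region.
  for (const e of available) {
    if (chosen.length === count) break;
    if (!chosen.includes(e)) chosen.push(e);
  }
  return chosen;
}
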
2

Parallel Execution

Each executor independently runs the same build in completely isolated environments using containerization or virtual machines. This ensures consistent results while preventing any single point of manipulation.

Isolation Methods:
  • Docker containers with fresh images
  • Virtual machines with clean snapshots
  • Kubernetes pods with network policies
  • Sandboxed execution environments
Security Measures:
  • No shared file systems or networks
  • Cryptographic source code verification
  • Timestamped execution logs
  • Resource usage monitoring
3

Result Submission

Executors submit cryptographically signed results including logs, artifacts, and exit codes with tamper-proof digital signatures. Each submission includes comprehensive metadata for verification.

Submission Contents:
Execution Data:
  • Complete build logs
  • Exit codes and error messages
  • Resource usage statistics
  • Execution timestamps
Verification Data:
  • SHA-256 hashes of all artifacts
  • Ed25519 digital signatures
  • Executor identity certificates
  • Environment fingerprints
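
As a rough TypeScript sketch of what an executor submission could look like, the example below hashes artifacts with SHA-256 and signs the payload with Ed25519 using Node's built-in crypto module. The field layout is an assumption for illustration; it is not the exact wire format.

// Sketch of how an executor might package a signed result submission.
// Uses Node's built-in crypto (SHA-256 + Ed25519); the field layout is an
// assumption for illustration only.
import { createHash, generateKeyPairSync, sign } from "node:crypto";

interface ResultSubmission {
  buildId: string;
  exitCode: number;
  artifactHashes: Record<string, string>; // file name -> SHA-256 hex
  finishedAt: string;
  signature: string;                      // Ed25519 over the payload, base64
}

function sha256Hex(data: Buffer): string {
  return createHash("sha256").update(data).digest("hex");
}

// In reality the executor's identity key is long-lived; it is generated here
// only so the example stays self-contained.
const { privateKey } = generateKeyPairSync("ed25519");

function buildSubmission(
  buildId: string,
  exitCode: number,
  artifacts: Record<string, Buffer>
): ResultSubmission {
  const artifactHashes = Object.fromEntries(
    Object.entries(artifacts).map(([name, data]) => [name, sha256Hex(data)])
  );
  const payload = { buildId, exitCode, artifactHashes, finishedAt: new Date().toISOString() };
  // Ed25519 signs the whole message; no separate digest algorithm is passed.
  const signature = sign(null, Buffer.from(JSON.stringify(payload)), privateKey).toString("base64");
  return { ...payload, signature };
}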

4

Consensus Verification

Results are compared and verified using Byzantine fault tolerance: a 2/3 majority is required for success, backed by a full cryptographic audit trail. The verification process ensures that no single malicious actor can compromise build integrity.

Consensus Rules:
  • ≥ 2/3 of executors must agree for success
  • Identical artifact hashes required
  • Exit codes must match exactly
  • Log patterns verified for consistency
Failure Scenarios:
  • < 2/3 agreement = consensus failed
  • Timeout = automatic retry
  • Hash mismatch = security alert
  • Minority results = audit logged

Result: When consensus is reached, the verified artifacts are cryptographically sealed and the execution record is permanently written to the blockchain, creating an immutable deployment history.
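
A minimal TypeScript sketch of the 2/3-majority rule described above: group results by a fingerprint of exit code plus artifact hashes and require at least two thirds of executors to land in the same group. This is illustrative only, not the Verification canister's code.

// Sketch of the 2/3-majority check: identical fingerprints (exit code +
// sorted artifact hashes) form a group; the largest group must cover at
// least two thirds of all submissions.

interface ExecutorResult {
  executorId: string;
  exitCode: number;
  artifactHashes: Record<string, string>; // file name -> SHA-256 hex
}

function fingerprint(r: ExecutorResult): string {
  const sortedHashes = Object.entries(r.artifactHashes).sort(([a], [b]) => a.localeCompare(b));
  return JSON.stringify({ exitCode: r.exitCode, sortedHashes });
}

function checkConsensus(results: ExecutorResult[]) {
  const groups = new Map<string, ExecutorResult[]>();
  for (const r of results) {
    const key = fingerprint(r);
    groups.set(key, [...(groups.get(key) ?? []), r]);
  }
  // Largest group of identical results
  const majority = [...groups.values()].sort((a, b) => b.length - a.length)[0] ?? [];
  const reached = results.length > 0 && majority.length * 3 >= results.length * 2; // >= 2/3
  return {
    consensusReached: reached,
    agreeing: majority.map((r) => r.executorId),
    // Minority results are kept for forensic analysis / audit logging.
    dissenting: results.filter((r) => !majority.includes(r)).map((r) => r.executorId),
  };
}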

Benefits

  • Eliminates single points of failure through distributed execution
  • Prevents malicious build tampering with cryptographic verification
  • Ensures reproducible builds across all executor environments
  • Provides cryptographic proof of build integrity
  • Enables transparent auditing and compliance tracking

Failure Handling

  • Executor timeouts handled gracefully with automatic failover
  • Minority disagreements logged for forensic analysis
  • Automatic retry mechanisms with exponential backoff
  • Fallback to alternative executors in different regions
  • Detailed failure forensics with complete audit trails

Security Model

Security Layers

Enterprise-Grade Protection

Cryptographic Identity

Ed25519 signatures for all transactions and communications

Encrypted Secrets

AES-256 encryption for all sensitive data and secrets

Network Security

TLS 1.3 for all communications and data transfer
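
For illustration, the sketch below shows AES-256-GCM encryption and decryption of a secret value with Node's built-in crypto module, the kind of authenticated encryption the Encrypted Secrets layer describes. Key management is simplified so the example stays self-contained; in practice keys never leave the secrets-management layer.

// Sketch of AES-256-GCM secret encryption. Simplified key handling;
// illustrative only.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

function encryptSecret(plaintext: string, key: Buffer) {
  const iv = randomBytes(12); // 96-bit nonce, standard for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, ciphertext, authTag: cipher.getAuthTag() };
}

function decryptSecret(enc: { iv: Buffer; ciphertext: Buffer; authTag: Buffer }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, enc.iv);
  decipher.setAuthTag(enc.authTag); // any tampering makes final() throw
  return Buffer.concat([decipher.update(enc.ciphertext), decipher.final()]).toString("utf8");
}

// Example: const key = randomBytes(32);
//          decryptSecret(encryptSecret("NPM_TOKEN=...", key), key)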

Pipeline Configuration

Configuration Structure

DCanary uses YAML configuration files to define your build and deployment pipelines. The configuration is declarative, meaning you describe what you want to happen, not how to do it. This ensures consistency and reproducibility across all executor nodes.

Basic Configuration

dcanary.yaml
name: my-awesome-app
version: "1.0"

# Build environment
environment:
  image: "node:18-alpine"
  variables:
    NODE_ENV: production
    
# Pipeline stages
stages:
  - name: install
    commands:
      - npm ci --production=false
      
  - name: lint
    commands:
      - npm run lint
      
  - name: test
    commands:
      - npm test
    coverage: true
    
  - name: build
    commands:
      - npm run build
    artifacts:
      - dist/
      - package.json
      
  - name: deploy
    commands:
      - dfx deploy
    when: branch == "main"

# Secrets management
secrets:
  - NPM_TOKEN
  - DEPLOY_KEY
  
# Notifications
notifications:
  slack:
    webhook: $SLACK_WEBHOOK
    channels: ["#deployments"]

Configuration Sections

Metadata

Project name, version, and description. Used for pipeline identification and management.

Environment

Docker image, environment variables, and runtime configuration for all executors.

Stages

Sequential or parallel build steps with commands, conditions, and artifact handling.

Secrets

Encrypted environment variables and credentials securely distributed to executors.
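
For readers who prefer types, the following TypeScript interface is a rough, unofficial view of the configuration sections above; the field names simply mirror the YAML keys used in the examples in this guide.

// Rough TypeScript view of the dcanary.yaml structure used in the examples.
// Not an official schema; field names mirror the YAML keys only.

interface PipelineConfig {
  name: string;
  version: string;
  environment?: {
    image: string;                       // e.g. "node:18-alpine"
    variables?: Record<string, string>;
  };
  stages: Stage[];
  secrets?: string[];                    // names only; values stay encrypted
  notifications?: {
    slack?: { webhook: string; channels: string[] };
  };
}

interface Stage {
  name: string;
  commands: string[];
  when?: string;                         // condition expression, e.g. 'branch == "main"'
  artifacts?: string[];
  cache?: { paths: string[]; key: string };
  coverage?: boolean;
  parallel?: boolean;
}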

Advanced Features

Conditional Execution

Branch-based Conditions
stages:
  - name: deploy-staging
    commands:
      - deploy-to-staging.sh
    when: branch == "develop"
    
  - name: deploy-production
    commands:
      - deploy-to-prod.sh
    when: branch == "main" && tag =~ "^v[0-9]+\\."
    
  - name: security-scan
    commands:
      - run-security-scan.sh
    when: files_changed =~ ".*\\.(js|ts|py)$"
Complex Conditions
stages:
  - name: performance-test
    commands:
      - run-perf-tests.sh
    when: |
      (branch == "main" || branch == "develop") &&
      commit_message !~ "\\[skip perf\\]" &&
      time_of_day >= "09:00" && time_of_day <= "17:00"
      
  - name: weekend-maintenance
    commands:
      - maintenance-tasks.sh
    when: day_of_week in ["saturday", "sunday"]

Parallel Execution

Independent Tasks
stages:
  - name: quality-checks
    parallel:
      - name: lint
        commands:
          - npm run lint
          - npm run format:check
      - name: type-check
        commands:
          - npm run type-check
      - name: security-audit
        commands:
          - npm audit --audit-level high
          - snyk test
Matrix Builds
stages:
  - name: test-matrix
    matrix:
      node_version: ["16", "18", "20"]
      os: ["ubuntu", "alpine"]
    commands:
      - npm test
    environment:
      image: "node:${matrix.node_version}-${matrix.os}"
    parallel_limit: 4
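
Conceptually, a matrix expands into the Cartesian product of its axis values, with parallel_limit capping how many combinations run at once. The TypeScript sketch below shows that expansion; it is illustrative, not DCanary's internal code.

// Sketch: expand a build matrix into concrete job combinations.

type Matrix = Record<string, string[]>;

function expandMatrix(matrix: Matrix): Record<string, string>[] {
  return Object.entries(matrix).reduce<Record<string, string>[]>(
    (combos, [axis, values]) =>
      combos.flatMap((combo) => values.map((v) => ({ ...combo, [axis]: v }))),
    [{}]
  );
}

// expandMatrix({ node_version: ["16", "18", "20"], os: ["ubuntu", "alpine"] })
// => 6 combinations, e.g. { node_version: "16", os: "ubuntu" }, ...
// Each combination would then run with image "node:${node_version}-${os}".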

Caching and Artifacts

Smart Caching
stages:
  - name: install
    commands:
      - npm ci
    cache:
      paths:
        - "node_modules/"
        - "~/.npm/"
      key: "${checksum:package-lock.json}-${env.NODE_VERSION}"
      restore_keys:
        - "${checksum:package-lock.json}-"
        - "npm-cache-"
      ttl: "7d"
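
A cache key such as ${checksum:package-lock.json}-${env.NODE_VERSION} resolves to the same value whenever the lockfile and Node version are unchanged, which is what makes the cache reusable across builds. The TypeScript sketch below shows one plausible way such a key could be resolved; the interpolation syntax itself is defined by DCanary, so treat the regexes here as an assumption.

// Sketch: resolve checksum/env placeholders in a cache key template.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

function resolveCacheKey(template: string, env: Record<string, string>): string {
  return template
    .replace(/\$\{checksum:([^}]+)\}/g, (_, file) =>
      createHash("sha256").update(readFileSync(file)).digest("hex").slice(0, 16)
    )
    .replace(/\$\{env\.([^}]+)\}/g, (_, name) => env[name] ?? "");
}

// resolveCacheKey("${checksum:package-lock.json}-${env.NODE_VERSION}", { NODE_VERSION: "18" })
// => e.g. "3fa1c2d4e5f6a7b8-18"  (same lockfile + Node version => same cache key)
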
Artifact Management
stages:
  - name: build
    commands:
      - npm run build
    artifacts:
      paths:
        - "dist/"
        - "coverage/"
      name: "build-${env.BUILD_NUMBER}"
      expire_in: "30d"
      public: false
    reports:
      coverage: "coverage/lcov.info"
      test: "test-results.xml"

Environment Management

Multi-Environment Setup

environments:
  development:
    image: "node:18-alpine"
    variables:
      NODE_ENV: development
      DEBUG: "*"
      API_URL: "https://api-dev.example.com"
    
  staging:
    image: "node:18-alpine"
    variables:
      NODE_ENV: staging
      API_URL: "https://api-staging.example.com"
    resources:
      cpu: "1.0"
      memory: "2Gi"
    
  production:
    image: "node:18-alpine"
    variables:
      NODE_ENV: production
      API_URL: "https://api.example.com"
    resources:
      cpu: "2.0"
      memory: "4Gi"
    security:
      read_only_fs: true
      no_new_privileges: true

Resource Specifications

Compute Resources
  • CPU: "0.5", "1.0", "2.0" (cores)
  • Memory: "512Mi", "1Gi", "4Gi"
  • GPU: "nvidia-tesla-v100" (for ML workloads)
  • Storage: "10Gi", "50Gi" (ephemeral)
Security Settings
  • read_only_fs: Mount filesystem as read-only
  • no_new_privileges: Prevent privilege escalation
  • user: Run as specific user ID
  • capabilities: Drop unnecessary Linux capabilities

Webhook Setup

GitHub Integration

1

Get Webhook URL

dcanary webhook get-url --pipeline my-app

Get your unique webhook endpoint URL

2

Configure GitHub Webhook

In your GitHub repository settings:

  • Go to Settings → Webhooks → Add webhook
  • Paste your DCanary webhook URL
  • Select "application/json" content type
  • Choose events: Push, Pull Request, Release
  • Set secret (get with: dcanary webhook get-secret)
3

Test Webhook

dcanary webhook test --pipeline my-app

Verify webhook is working correctly

GitLab Integration

Setup Steps

  1. Go to Project Settings → Webhooks
  2. Add your DCanary webhook URL
  3. Select trigger events (Push, Merge Request)
  4. Add secret token for security
  5. Test the webhook connection

Supported Events

  • Push events
  • Merge request events
  • Tag push events
  • Release events
  • Pipeline events

Custom Webhook Configuration

Manual Webhook Triggers

curl -X POST https://webhook.dcanary.dev/trigger \
  -H "Content-Type: application/json" \
  -H "X-DCanary-Secret: your-webhook-secret" \
  -d '{
    "repository": "user/repo",
    "ref": "refs/heads/main",
    "commits": [{"id": "abc123", "message": "Update code"}]
  }'

Webhook Security

Secret Validation

All webhooks are validated using HMAC-SHA256
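
The sketch below shows what HMAC-SHA256 validation looks like in TypeScript for a GitHub-style X-Hub-Signature-256 header, using Node's built-in crypto module with a constant-time comparison. It illustrates the mechanism; it is not DCanary's exact implementation.

// Sketch: verify an HMAC-SHA256 webhook signature (GitHub-style header).
import { createHmac, timingSafeEqual } from "node:crypto";

function isValidSignature(rawBody: string, signatureHeader: string, secret: string): boolean {
  const expected = "sha256=" + createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // Constant-time comparison prevents timing attacks on the signature check.
  return a.length === b.length && timingSafeEqual(a, b);
}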

IP Filtering

Restrict webhook sources by IP addresses

Monitoring & Logs

Real-time Monitoring

Build Status Dashboard

dcanary dashboard --web

Launch web dashboard at http://localhost:8080

CLI Monitoring

dcanary status --live

Watch build progress in real-time

Monitoring Metrics

Build Duration

Average: 3m 45s

Success Rate

98.5% (last 30 days)

Active Executors

5 nodes online

Log Management

Viewing Build Logs

Live Logs
dcanary logs --follow <build-id>
Historical Logs
dcanary logs --build <build-id>
Executor Logs
dcanary logs --executor <executor-id>
Error Logs
dcanary logs --errors --last 24h

Log Filtering & Search

# Filter by log level
dcanary logs --level error,warn

# Search in logs
dcanary logs --search "compilation failed"

# Filter by time range
dcanary logs --since "2024-01-15 10:00" --until "2024-01-15 11:00"

# Export logs
dcanary logs --export json > build-logs.json

Alerting & Notifications

Slack Integration

notifications:
  slack:
    webhook: $SLACK_WEBHOOK
    channels: ["#deployments", "#alerts"]
    on_success: true
    on_failure: true
    on_timeout: true

Email Alerts

notifications:
  email:
    smtp_server: "smtp.gmail.com"
    recipients: ["team@company.com"]
    on_failure: true
    digest: "daily"

Troubleshooting Guide

Common Issues & Solutions

Build Consensus Failed

Executors couldn't reach consensus on build results. This happens when different executors produce different outputs or when insufficient executors complete the build.

Diagnostic Commands
dcanary logs --consensus-details <build-id>
dcanary executor status --all
dcanary debug consensus <build-id>
Common Causes & Fixes
  • Non-deterministic builds: Remove timestamps, random values, or file ordering issues
  • Environment differences: Ensure all executors use identical base images
  • Network issues: Check executor connectivity and timeouts
  • Source code issues: Verify all executors have same source version

Webhook Not Triggering

Pipeline not starting on Git push events. This is usually due to webhook configuration issues or network connectivity problems.

Debugging Steps
dcanary webhook test --url <webhook-url>

Test webhook endpoint accessibility

dcanary webhook logs --last 24h

Check recent webhook delivery attempts

dcanary webhook validate --repo <repo>

Verify webhook configuration

Verification Checklist
  • Webhook URL is correct in repository settings
  • Webhook secret matches configuration
  • Repository has public webhook access
  • Branch/tag filters are configured correctly
  • Webhook canister is deployed and running

Executor Connection Issues

Executors not joining the network or becoming unavailable during builds. This affects consensus and build reliability.

Network Diagnostics
dcanary executor diagnostics
dcanary network ping --all
dcanary executor health
Identity Issues
  • • Check executor private key permissions
  • • Verify Internet Computer identity
  • • Ensure sufficient ICP balance
  • • Validate identity certificates
Resource Problems
  • • Monitor CPU and memory usage
  • • Check disk space availability
  • • Verify network bandwidth
  • • Review docker daemon status

Build Timeouts

Builds taking too long or timing out before completion. This can be due to resource constraints, inefficient build processes, or network issues.

Timeout Configuration
# Global timeout settings
timeout:
  global: "45m"        # Maximum total build time
  stage: "15m"         # Maximum per-stage time
  consensus: "5m"      # Consensus agreement timeout

# Per-stage timeouts
stages:
  - name: long-running-task
    commands:
      - run-heavy-process.sh
    timeout: "30m"
    
  - name: quick-task
    commands:
      - quick-script.sh
    timeout: "2m"
Optimization Strategies
  • Increase caching: Cache dependencies, build artifacts, and Docker layers
  • Parallelize stages: Run independent tasks concurrently
  • Resource scaling: Increase CPU/memory for slow executors
  • Build optimization: Remove unnecessary dependencies and steps

Advanced Debugging Techniques

Deep Inspection

System Health Check
dcanary health --verbose --export health.json

Comprehensive system health report with metrics

Consensus Analysis
dcanary consensus analyze <build-id> --diff

Compare executor results and identify differences

Network Topology
dcanary network topology --live

Real-time network connectivity visualization

Performance Analysis

Build Performance
dcanary perf analyze --build <build-id>

Detailed timing and resource usage analysis

Bottleneck Detection
dcanary perf bottlenecks --last 7d

Identify performance bottlenecks over time

Resource Trending
dcanary metrics --grafana --export

Export metrics for external monitoring tools

Getting Help & Support

Community Discord

Join our active community for real-time help and discussions

Join Discord

GitHub Issues

Report bugs, request features, and browse existing issues

View Issues

Enterprise Support

Direct support for enterprise customers and complex issues

Contact Support

Pro Tips for Getting Help

  • Include the output of dcanary debug --export when reporting issues
  • Provide your pipeline configuration (with secrets redacted)
  • Include relevant build logs and error messages
  • Specify your DCanary CLI version and operating system
  • Check the FAQ and existing issues before posting new ones

CLI Commands Reference

Basic Commands

dcanary init [options]

Initialize a new DCanary project

dcanary pipeline create --name <name> --repo <repo>

Create a new pipeline

dcanary deploy [environment]

Deploy pipeline to network

dcanary status [pipeline-id]

Check pipeline status and recent builds

Advanced Commands

Secrets Management

dcanary secrets add <name> <value>
dcanary secrets list
dcanary secrets delete <name>

Executor Operations

dcanary executor start --daemon
dcanary executor join --stake 1000
dcanary executor stats

Logging & Debugging

View Build Logs

dcanary logs --follow <build-id>

Export Debug Info

dcanary debug --export --output debug.json

Validate Configuration

dcanary validate --config dcanary.yaml

Canister APIs

Webhook Canister API

// Trigger a pipeline execution
type TriggerRequest = {
  repository: string;
  branch: string;
  commit_sha: string;
  event_type: "push" | "pull_request" | "tag";
};

// Get pipeline execution status
type StatusResponse = {
  build_id: string;
  status: "pending" | "running" | "success" | "failed";
  executors: Array<ExecutorResult>;
  consensus_reached: boolean;
};
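
Calling a canister from TypeScript typically goes through @dfinity/agent. The sketch below is hypothetical: the canister ID, the get_status method name, and the simplified text response are placeholders, so check the deployed canister's Candid interface for the real service definition.

// Hypothetical sketch of querying a DCanary canister with @dfinity/agent.
// Canister ID and method name are placeholders, not documented values.
import { Actor, HttpAgent } from "@dfinity/agent";
import { IDL } from "@dfinity/candid";

const idlFactory: IDL.InterfaceFactory = ({ IDL }) =>
  IDL.Service({
    get_status: IDL.Func([IDL.Text], [IDL.Text], ["query"]), // simplified response
  });

async function fetchBuildStatus(buildId: string): Promise<string> {
  const agent = new HttpAgent({ host: "https://icp0.io" });
  const actor = Actor.createActor(idlFactory, {
    agent,
    canisterId: "aaaaa-aa", // placeholder; use your deployed canister ID
  });
  return (await actor.get_status(buildId)) as string;
}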

Webhook API

Webhook Endpoints

POST /webhook/github/{pipeline_id}

GitHub webhook endpoint for pipeline triggers

GET /status/{build_id}

Get build execution status and logs
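
Here is a small TypeScript sketch that polls the status endpoint above with the built-in fetch API (Node 18+). The response shape follows the StatusResponse type from the Canister APIs section; the base URL matches the manual-trigger example and may differ for your deployment.

// Sketch: poll GET /status/{build_id} until the build finishes.

interface StatusResponse {
  build_id: string;
  status: "pending" | "running" | "success" | "failed";
  consensus_reached: boolean;
}

async function waitForBuild(buildId: string, baseUrl = "https://webhook.dcanary.dev"): Promise<StatusResponse> {
  for (;;) {
    const res = await fetch(`${baseUrl}/status/${buildId}`);
    if (!res.ok) throw new Error(`status request failed: ${res.status}`);
    const body = (await res.json()) as StatusResponse;
    if (body.status === "success" || body.status === "failed") return body;
    await new Promise((r) => setTimeout(r, 5000)); // poll every 5 seconds
  }
}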

Custom Executors

Running Your Own Executor

# Install executor agent
npm install -g @dcanary/executor

# Configure executor identity
dcanary executor init --identity ~/.config/dfx/identity/default

# Start executor daemon
dcanary executor start --daemon

# Join the network
dcanary executor join --stake 1000

Network Configuration

Network Selection

Local Development

# Start local replica
dfx start --background

# Deploy to local network
dcanary deploy --network local

Mainnet Production

# Configure mainnet
dcanary config set network ic

# Deploy to mainnet
dcanary deploy --network ic

Performance Tuning

Build Optimization

Caching Strategies

cache:
  paths:
    - node_modules/
    - ~/.cargo/
    - target/
  key: "${checksum:package-lock.json}"

Parallel Execution

stages:
  - name: test
    parallel:
      - npm run test:unit
      - npm run test:integration
      - npm run test:e2e

Performance & Best Practices

Build Optimization Tips

⚡ Speed Optimizations

  • Enable dependency caching
  • Use minimal Docker base images
  • Implement parallel job execution
  • Optimize test suite performance

🔒 Security Essentials

  • Never commit secrets to code
  • Use role-based access control
  • Regularly audit permissions
  • Monitor for security vulnerabilities

Configuration Examples

High-Performance Setup

performance:
  cache:
    enabled: true
    paths: ["node_modules/", ".cache/"]
  
  resources:
    cpu: "4.0"
    memory: "8Gi"
  
  parallel_jobs: 3

Production Security

security:
  secrets_encryption: true
  network_isolation: true
  audit_logging: enabled
  
  access_control:
    require_2fa: true
    session_timeout: "8h"