About Daestro
Daestro is a cloud orchestration platform that lets you run computing tasks across cloud services such as AWS, DigitalOcean, Linode, and Vultr, as well as on your own self-hosted servers. It acts as a central control panel for managing all your compute workloads, regardless of where they physically run.
To begin using Daestro, you can sign up for the Daestro Console using your GitHub account. The console serves as your central dashboard for managing all the platform's features, from connecting your cloud accounts to defining and running your jobs.
Daestro currently supports a range of popular cloud providers, including Amazon Web Services (AWS), Vultr, DigitalOcean, and Linode. You can connect your accounts from any of these providers to start running your workloads on their infrastructure through the Daestro platform.
A Cloud Auth is a secure credential you create within Daestro by providing an API key from your cloud provider account (like AWS or DigitalOcean). This allows Daestro to programmatically access your cloud resources on your behalf to provision and manage servers for your jobs. Your API keys are always stored in an encrypted format for security.
You are responsible for the costs of the underlying cloud infrastructure consumed when Daestro runs jobs on your behalf. Daestro uses your own cloud provider account to provision resources, so all usage fees from providers like AWS or DigitalOcean are billed directly to you, while Daestro only charges a separate fee for using its platform.
A Compute Environment acts as a template or specification for the servers that will run your jobs. It defines critical parameters such as the cloud provider to use (via a Cloud Auth), the server type (e.g., t2.micro on AWS), and the geographical location or region for the server. You can also create a special 'Self-hosted' compute environment to run jobs on your own hardware.
Yes, you can run jobs on your own machines, whether it's a laptop, a desktop, or an on-premise server. To do this, you create a 'Self-hosted' Compute Environment, generate an Agent Auth Token, and then run the Daestro Agent in a Docker container on your machine, securely linking it to the platform.
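To make the agent step concrete, here is a hedged sketch of what connecting a self-hosted machine might look like. The image name `daestro/agent`, the environment variable, and the flags below are illustrative placeholders, not Daestro's documented interface; the console generates the exact command for your Agent Auth Token.

```shell
# Illustrative only: image name, env variable, and flags are placeholders.
# Use the command the Daestro Console generates for your Agent Auth Token.
docker run -d \
  --name daestro-agent \
  -e AGENT_AUTH_TOKEN="<your-agent-auth-token>" \
  -v /var/run/docker.sock:/var/run/docker.sock \
  daestro/agent:latest
```

Mounting the Docker socket is a common pattern for agents that launch job containers on the host, but check Daestro's own instructions for what the agent actually requires.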
A Job Queue is a management tool that controls how and when your jobs are executed. It allows you to set the priority of different queues, limit the number of jobs that can run at the same time (concurrency), and associate specific Compute Environments that will be used to process the jobs within that queue.
A Job Definition is a detailed blueprint for a task you want to run. It specifies everything Daestro needs to know to execute your job, including the Docker image to use, the specific commands to run inside the container, required environment variables, and resource settings like an execution timeout.
Yes, Daestro supports creating Job Definitions directly from bash scripts. When you choose the 'Bash Script' type, you can write your script in the console, and Daestro will automatically run it within a default `ubuntu:24.04` container, simplifying tasks that don't require a custom Docker image.
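A minimal sketch of a script suited to the 'Bash Script' type. Since Daestro runs it in a stock `ubuntu:24.04` container, it sticks to bash and coreutils; the "work" here is a placeholder arithmetic step:

```shell
#!/usr/bin/env bash
# Sample script for a 'Bash Script' Job Definition. It runs in a default
# ubuntu:24.04 container, so rely only on stock tools (bash, coreutils)
# or install extras yourself with apt-get at the top of the script.
set -euo pipefail

echo "job started at $(date -u +%FT%TZ)"
result=$(( 2 + 2 ))        # stand-in for the real work
echo "result=$result"
echo "job finished"
```

`set -euo pipefail` makes the job fail fast on errors, which keeps a failed run from being reported as successful.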
You can make your jobs dynamic by using Command Parameters: placeholders in your Job Definition's command that are written with the `Param::` prefix. You can provide values for these parameters in the Job Definition itself or, more flexibly, override them with new values each time you submit a job.
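Daestro's exact substitution mechanics aren't shown here; the sketch below just mimics the idea in plain bash, and the parameter name `region` is an invented example:

```shell
# Illustrative sketch of parameter substitution (not Daestro internals):
# a command template containing a 'Param::'-prefixed placeholder has that
# placeholder replaced by a value supplied at job submission time.
template='backup.sh --region Param::region'
region_value='eu-west-1'    # value provided when submitting the job
command=${template//Param::region/$region_value}
echo "$command"             # backup.sh --region eu-west-1
```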
A Job Definition is the reusable template or blueprint that outlines what a task does, including the Docker image and commands. A Job, on the other hand, is the actual, runnable instance that is created based on a Job Definition. You can run many Jobs from a single Job Definition.
Daestro allows you to automate job execution by creating Cron Jobs. You can define a standard cron expression to specify a recurring schedule, such as running a task every hour or once a day. Each time the schedule is met, Daestro will automatically submit a new job based on the Job Definition and Job Queue you specify.
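For reference, these are standard five-field cron expressions, not Daestro-specific syntax:

```
# minute  hour  day-of-month  month  day-of-week
0 * * * *     # every hour, on the hour
30 2 * * *    # every day at 02:30
0 9 * * 1     # every Monday at 09:00
```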
A Compute Spawn represents an actual server or virtual machine that is linked to the Daestro platform to execute jobs. This can be a cloud instance that Daestro automatically provisions and terminates for you based on your Job Queue's needs, or a self-hosted machine that you have manually connected using the Daestro Agent.
Daestro manages scaling for cloud environments through its Job Queue and Compute Spawn system. It automatically provisions new cloud-based Compute Spawns (servers) when jobs are submitted to a queue and terminates them after a configurable idle period, ensuring you only pay for resources while they are actively needed.
To use images from a private repository like Docker Hub or a self-hosted registry, you must first create a Container Registry Auth, which securely stores the necessary credentials (a username and password, or an access token). You can then associate this credential with your Job Definition, allowing Daestro to pull the private image at runtime.
Yes, Daestro provides RESTful APIs that allow you to interact with its services programmatically. You can generate an API key from the Daestro Console and use it to authenticate your requests, enabling you to build integrations and automate workflows such as submitting, monitoring, and canceling jobs.
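The request below only sketches the general shape of an authenticated REST call. The base URL, path, header scheme, and payload fields are all hypothetical placeholders; consult Daestro's API reference for the actual routes and authentication format.

```shell
# Hypothetical request shape only: URL, path, header, and payload fields
# are placeholders, not Daestro's documented API.
curl -s -X POST "https://console.daestro.example/api/v1/jobs" \
  -H "Authorization: Bearer $DAESTRO_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"job_definition": "nightly-report", "job_queue": "default"}'
```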
Job Definitions are immutable, but you can create new versions of them by creating a 'revision'. This process creates a new Job Definition with the same name but an incremented version number, allowing you to introduce changes, test them, and gradually migrate your workflows without interrupting any jobs that are currently running using the older version.