Burla is a library for running Python functions on (lots of) computers in the cloud.
Quickstart:
To install, run:
pip install burla
To create an account/login, run:
burla login
Click the "start cluster" button at cluster.burla.dev
Once booted, try the following basic example:
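A minimal sketch of that basic example, assuming `remote_parallel_map` is Burla's entry point (the function name is an assumption, not confirmed by this section). The sketch falls back to a local loop when `burla` isn't installed, so the same code runs anywhere:

```python
def double(x):
    return x * 2

my_inputs = list(range(10))

try:
    # Assumed Burla API: runs `double` over `my_inputs` across the cluster.
    from burla import remote_parallel_map
    results = remote_parallel_map(double, my_inputs)
except ImportError:
    # burla not installed: run the same function locally instead.
    results = [double(x) for x in my_inputs]

print(results)  # → [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The point of the API is that the call site looks like a normal local function call; only the mapping over inputs is distributed.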
What is Burla:
Burla is kind of like AWS Lambda, except it:
deploys code in seconds
is invoked like a normal local Python function
lets you run code on any hardware, and change it on the fly / per request
lets you run code in any custom docker/OCI container
has no limit on runtime (lambda has a 15min limit)
is open-source, and designed to be self-hosted
To use Burla you must have a running cluster that the client knows about. Currently, our library is hardcoded to call only our free public cluster (cluster.burla.dev). Right now, this cluster is configured to run 16 nodes, each with 32 CPUs & 128G of RAM.
Burla clusters are multi-tenant: one cluster can run many jobs from separate users. Nodes in a Burla cluster are single-tenant: your job will never run on the same machine as another job.
Components / How it works:
Burla's major components are split across 4 separate GitHub repositories.
Burla: the Python package (the client).
main_service: service representing a single cluster; manages nodes, routes requests to node_services.
node_service: service running on each node; manages containers, routes requests to container_services.
container_service: service running inside each container; executes user-submitted functions.
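The routing hierarchy above can be sketched as plain functions (illustrative only, not Burla's real code; names mirror the repositories, but all signatures here are assumptions):

```python
def container_service(func, inp):
    # Runs inside each container: executes the user-submitted function.
    return func(inp)

def node_service(func, inputs):
    # Runs on each node: manages containers, hands each input to a container_service.
    return [container_service(func, i) for i in inputs]

def main_service(func, inputs, nodes=2):
    # One per cluster: shards inputs across nodes, collects the results.
    shards = [inputs[i::nodes] for i in range(nodes)]
    results = []
    for shard in shards:
        results.extend(node_service(func, shard))
    return results

print(main_service(lambda x: x * 2, [1, 2, 3, 4]))  # → [2, 6, 4, 8]
```

Note the results come back shard-by-shard, not in input order; in a real system each layer would also handle scheduling, isolation, and failures.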