With Burla, anyone can scale arbitrary code over thousands of virtual machines in the cloud. It's open-source, requires almost no setup, and is so simple even complete beginners can use it.
Burla is a library with only one function: remote_parallel_map. Given a Python function and a list of arguments, remote_parallel_map calls the function on every argument in the list at the same time, each on a separate virtual machine in the cloud.
Here's an example:
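The sketch below assumes remote_parallel_map is imported from the burla package and is called with the function first and the list of inputs second; the input values and the doubling logic are only illustrative.

```python
from burla import remote_parallel_map

# A simple function we want to run on many machines at once.
def my_function(my_input):
    print(f"processing input #{my_input} on its own VM")
    return my_input * 2

# 1000 inputs -> 1000 parallel calls, each on a separate virtual machine.
my_inputs = list(range(1000))

results = remote_parallel_map(my_function, my_inputs)
print(results[:5])  # e.g. [0, 2, 4, 6, 8]
```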
In the above example, each call to my_function runs on a separate virtual machine, in parallel.
With Burla, running code on remote computers feels the same as running locally. This means:
Any errors your function throws will appear on your local machine as they normally would.
Anything you print appears in your local stdout, just like it normally does (see the sketch after this list).
Responses are pretty quick (you can run a million simple functions in a couple seconds).
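As a rough illustration of the two points above (using the same assumed remote_parallel_map signature; the inputs and the ValueError are made up), print output from the remote machines shows up in your local stdout, and an exception raised remotely can be caught locally like any other exception:

```python
from burla import remote_parallel_map

def my_function(x):
    print(f"processing {x}")  # printed on a remote VM, appears in your local stdout
    if x == 3:
        raise ValueError("bad input")  # raised remotely, shows up on your machine
    return x * 2

try:
    remote_parallel_map(my_function, [1, 2, 3])
except ValueError as e:
    print(f"caught an error that happened on a remote machine: {e}")
```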
Click here to learn more about remote_parallel_map.
Burla is open-source cluster-compute software designed to be self-hosted in the cloud.
To use Burla you must have a cluster running that the client knows about. Currently, our library is hardcoded to call only our free public cluster (cluster.burla.dev), which we've deployed to make Burla easy for anyone to try. This cluster is currently configured to run 16 nodes, each with 32 CPUs and 128GB of RAM.
Burla clusters are multi-tenant: a single cluster can run many jobs from separate users. Nodes in a Burla cluster are single-tenant: your job will never run on the same machine as another user's job. Click here to learn more about how Burla clusters work.
Questions? Schedule a call with us, or email jake@burla.dev. We're always happy to talk.